*******************************************************
* 20_Simulation.do
** Simulating the evolution of carbon content computations
** Source Data: Destatis IO, yr = 2018, state of evaluation: 2020, published 2021
** Volkswirtschaftliche Gesamtrechnung -- Input-Output-Rechnung
** (national accounts -- input-output computations)
** 71 product groups
** Author: Ulf von Kalckreuth
** This version: 07.01.2024, based on the version dated 10.01.2022
********************************************************

clear
cap clear all
cap clear mata
set maxvar 11000
set matsize 11000
set more off
cap log close

window manage maintitle "Preliminary"

global code "Y:\S_0-30\Projekte\Sustainable_Finance\Dofiles\LAJCB"
global input "Y:\S_0-30\Projekte\Sustainable_Finance\Simulation"
** global log "Y:\S_0-30\Projekte\Sustainable_Finance\Logs\LAJCB"
global log "C:\Users\s3504uk\LAJCBR\Logs"
** global data_out "Y:\S_0-30\Projekte\Sustainable_Finance\Processed\LAJCB"
** global sim_out "Y:\S_0-30\Projekte\Sustainable_Finance\Simulation\LAJCB"
global sim_out "C:\Users\s3504uk\LAJCBR\Files"

log using "$log\20_Simulation_1-8", t replace
** quietly putexcel set "$data_out\sectoral cc", replace

** Create a macro with the group names
local groups = "g1 g2 g3 g4 g5 g6 g7 g8 g9 g10 " + ///
	"g11 g12 g13 g14 g15 g16 g17 g18 g19 g20 " + ///
	"g21 g22 g23 g24 g25 g26 g27 g28 g29 g30 " + ///
	"g31 g32 g33 g34 g35 g36 g37 g38 g39 g40 " + ///
	"g41 g42 g43 g44 g45 g46 g47 g48 g49 g50 " + ///
	"g51 g52 g53 g54 g55 g56 g57 g58 g59 g60 " + ///
	"g61 g62 g63 g64 g65 g66 g67 g68 g69 g70 " + ///
	"g71"

** Define product group labels (CPA classification)
label define prodgroups ///
	1 "CPA 01" ///
	2 "CPA 02" ///
	3 "CPA 03" ///
	4 "CPA 05" ///
	5 "CPA 06" ///
	6 "CPA 07-09" ///
	7 "CPA 10-12" ///
	8 "CPA 13-15" ///
	9 "CPA 16" ///
	10 "CPA 17" ///
	11 "CPA 18" ///
	12 "CPA 19" ///
	13 "CPA 20" ///
	14 "CPA 21" ///
	15 "CPA 22" ///
	16 "CPA 23.1" ///
	17 "CPA 23.2-23.9" ///
	18 "CPA 24.1-24.3" ///
	19 "CPA 24.4" ///
	20 "CPA 24.5" ///
	21 "CPA 25" ///
	22 "CPA 26" ///
	23 "CPA 27" ///
	24 "CPA 28" ///
	25 "CPA 29" ///
	26 "CPA 30" ///
	27 "CPA 31-32" ///
	28 "CPA 33" ///
	29 "CPA 35.1, 35.3" ///
	30 "CPA 35.2" ///
	31 "CPA 36" ///
	32 "CPA 37-39" ///
	33 "CPA 41" ///
	34 "CPA 42" ///
	35 "CPA 43" ///
	36 "CPA 45" ///
	37 "CPA 46" ///
	38 "CPA 47" ///
	39 "CPA 49" ///
	40 "CPA 50" ///
	41 "CPA 51" ///
	42 "CPA 52" ///
	43 "CPA 53" ///
	44 "CPA 55-56" ///
	45 "CPA 58" ///
	46 "CPA 59-60" ///
	47 "CPA 61" ///
	48 "CPA 62-63" ///
	49 "CPA 64" ///
	50 "CPA 65" ///
	51 "CPA 66" ///
	52 "CPA 68" ///
	53 "CPA 69-70" ///
	54 "CPA 71" ///
	55 "CPA 72" ///
	56 "CPA 73" ///
	57 "CPA 74-75" ///
	58 "CPA 77" ///
	59 "CPA 78" ///
	60 "CPA 79" ///
	61 "CPA 80-82" ///
	62 "CPA 84.1-84.2" ///
	63 "CPA 84.3" ///
	64 "CPA 85" ///
	65 "CPA 86" ///
	66 "CPA 87-88" ///
	67 "CPA 90-92" ///
	68 "CPA 93" ///
	69 "CPA 94" ///
	70 "CPA 95" ///
	71 "CPA 96"

*****************************************************
** Loading data
*****************************************************

** Load sectoral IO coefficient matrix
use $input\71group_coeff_mat_2018.dta, clear
mkmat g1-g71, matrix(io_coeff)
matname io_coeff `groups'
mat io_coeff = io_coeff/100 // coefficients originally given as percentages
mat list io_coeff

** Load direct emissions
use $input\dir_emissions_2018.dta, clear

** Vector of direct emission quantities, in 1000 tons
mkmat co2_quant, matrix(co2_quant)
mat rownames co2_quant = `groups'

** Vector of direct emission intensities, in kg/Euro
mkmat co2_int, matrix(co2_int)
mat rownames co2_int = `groups'

** Load sectoral use values from IO tables
use $input\use_data_2018.dta, clear

** Vector of total use, national production plus imports, in Mill €
mkmat tot_use_all, matrix(tot_use)
mat rownames tot_use = `groups'

** Vector of total use, national production, in Mill €
mkmat tot_use_nat, matrix(nat_prod)
mat rownames nat_prod = `groups'

** Vector of total use, imports, in Mill €
mkmat tot_use_imp, matrix(imp_prod)
mat rownames imp_prod = `groups'

** Vector of final use, national production plus imports, in Mill €
mkmat fin_use_all, matrix(fin_use)
mat rownames fin_use = `groups'

********************************************
** I. COMPUTATIONS ON A SECTORAL BASIS
********************************************

** Compute the Leontief inverse
mat leontief = inv(I(71)-io_coeff)
mat list leontief

** Compute the matrix power series expansion
mat itera = I(71)
forvalues i = 1/20 {
	di
	di "Iteration ", `i'
	matrix itera = I(71) + itera * io_coeff
	** matrix list itera
	di "Relative difference to Leontief inverse: ", mreldif(itera, leontief)
	di
}
di "Power series expansion of Leontief matrix after 20 iterations"
matrix list itera
mat check = itera - leontief
di "Remaining difference to Leontief matrix"
mat list check

** Compute sectoral carbon costs based on emission intensities, using the Leontief inverse
mat cc_group = (co2_int' * leontief)'
mat list cc_group

** Compute sectoral carbon costs by expansion
mat itera = co2_int
forvalues i = 1/20 {
	di
	di "Iteration ", `i'
	matrix itera = (co2_int' + itera' * io_coeff)'
	** matrix list itera
	matrix diff = itera - cc_group
}
di "Power series expansion of carbon costs after 20 iterations"
matrix list itera
di "Remaining difference to true carbon costs: "
matrix list diff

*********************************************************************************
** Compute weighted sector averages of carbon costs
*********************************************************************************
local numgroup = rowsof(tot_use) // number of product groups

** 1) Compute average using total use weights
** Remark: the average based on total use weights should correspond to the average for products in the simulation
scalar sumweights = 0
scalar sum = 0
forvalues i = 1/`numgroup' {
	scalar sum = sum + cc_group[`i',1]*tot_use[`i',1]
	scalar sumweights = sumweights + tot_use[`i',1]
}
scalar average = sum/sumweights
di "Average sector carbon costs, total use weights: ", average

** 2) Compute average using final use weights
** Remark: the average based on final use weights should correspond to the carbon content of final use in Germany, Destatis (2018)
scalar sumweights = 0
scalar sum = 0
forvalues i = 1/`numgroup' {
	scalar sum = sum + cc_group[`i',1]*fin_use[`i',1]
	scalar sumweights = sumweights + fin_use[`i',1]
}
scalar average = sum/sumweights
di "Average sector carbon costs, final use weights: ", average
scalar drop sum sumweights average

*********************************************************************************
** Two checks for industry level carbon content
*********************************************************************************

** 1) Compute carbon content of final use of sector products -- to be compared with Destatis (2018)
mat cc_final = hadamard(cc_group, fin_use)
mat sum_cc_final = J(1,`numgroup',1) * cc_final // sum the elements of cc_final
di "Carbon content of final demand by sector: "
matrix list cc_final
di "Carbon content of overall final use: "
matrix list sum_cc_final

** 2) Compute carbon content of imports and compare with the carbon content of final use and the sum of direct emissions
mat cc_import = hadamard(cc_group, imp_prod)
mat sum_cc_import = J(1,`numgroup',1) * cc_import // sum the elements of cc_import
di "Carbon content of imports by sector: "
matrix list cc_import
di "Carbon content of overall imports: "
matrix list sum_cc_import
mat sum_direct = J(1,`numgroup',1) * co2_quant
di "Direct carbon emissions: "
matrix list sum_direct
di "Check identity:"
matrix check = sum_cc_final - sum_cc_import - sum_direct
matrix list check
matrix drop check cc_final sum_cc_final cc_import sum_cc_import sum_direct

*********************************************************************************************************
** II. GENERATE PRODUCT LEVEL BILLS OF MATERIAL (BOM), DIRECT EMISSIONS AND INITIAL VALUES FOR CARBON CONTENTS
*********************************************************************************************************

** Create vector with numbers of products by product group
*******************************************************************************************************************************************************
local aggreg = 1000 // Governs the size of the BOM matrix. IO figures are in millions. Total use figures are divided by
                    // aggreg, and the number of firms per sector is the rounded integer of the production value after dividing.
                    // A product/firm in the BOM thus represents a production value of aggreg/1000 bill Euro.
                    // A value of 1000 will produce 7698 products and take around 1:25 h of run time.
*******************************************************************************************************************************************************
mat numprod = J(`numgroup',1,0) // initiate
mat list numprod
forvalues i = 1/`numgroup' {
	scalar num = round(tot_use[`i',1]/`aggreg')
	if num == 0 {
		scalar num = 1 // We want at least one product in every sector
	}
	mat numprod[`i',1] = num
}
mat rownames numprod = `groups'
display "Number of products by product group"
mat list numprod

mat sumprod = numprod'*J(`numgroup',1,1)
local sumprod = sumprod[1,1]
mat drop sumprod
display "Number of products, each representing a production value of around", `aggreg'/1000, "bill €: ", `sumprod'

** Create vector indicating the product group each product belongs to
forvalues i = 1/`numgroup' { // products of group i
	mat prodgroup_i = J(numprod[`i',1],1, `i')
	mat prodgroup = nullmat(prodgroup)\prodgroup_i // concatenating product group indicators
}
display "Check: rows product group labels: ", rowsof(prodgroup)

**************************************************************************
** Direct emissions vector
**************************************************************************

** Create vector of average direct emissions from group level aggregates
forvalues i = 1/`numgroup' { // products of group i
	mat dirav_i = J(numprod[`i',1],1,co2_int[`i',1])
	mat dirav = nullmat(dirav)\dirav_i // concatenating values for products of group i to the emission intensities vector
}
display "Check: rows carbon costs from sectoral aggregates: ", rowsof(dirav)

** Direct emissions disturbance: Create a conforming vector with lognormal disturbances.
** The distribution parameters of the underlying normal are mu = -sigma^2/2 and sigma.
** This translates to a lognormal distribution with expectation 1 and standard deviation sqrt(exp(sigma^2) - 1).
** The sigma parameter for direct emissions is calibrated to the standard deviation of industry level mean deviations of GHG scope 1 intensity in Trucost.
** In principle, sigma could be computed industry by industry.

****************************************************************************************************************
** Setting the sigma parameter for direct emissions
** Value from the overall (unweighted) standard deviation of log direct emission intensity in the Trucost data
****************************************************************************************************************
** scalar dirsigma = 1.253336 // value for CPA 71
scalar dirsigma = .9273174 // value for the Trucost primary sector, used and published in the discussion paper version

set seed 0
mat dirdist = matuniform(`sumprod', 1)
forvalues i = 1/`sumprod' {
	scalar prob = dirdist[`i',1]
	scalar vnormal = invnormal(prob)
	scalar vlognormal = exp(vnormal * dirsigma - dirsigma^2/2)
	mat dirdist[`i',1] = vlognormal
}

***********************************************************************************************
** Check: Compute empirical moments of the disturbance to direct emission intensities
***********************************************************************************************
scalar sumsquare = 0
scalar sum = 0
forvalues i = 1/`sumprod' {
	scalar sum = sum + dirdist[`i',1]
	scalar sumsquare = sumsquare + (dirdist[`i',1])^2 // summing up the squares
}
scalar average = sum/`sumprod'
scalar var = sumsquare/`sumprod' - average^2 // variance via the shift theorem ("Verschiebesatz")
scalar sd = sqrt(var) // standard deviation
di "------- Empirical moments of shocks to direct emissions (lognormal) ----------------"
di "Sample mean = ", average
di "Sample variance = ", var
di "Sample std = ", sd
di "------- Theoretical moments ----------------"
di "Mean = 1"
di "Variance = ", exp(dirsigma^2) - 1
di "Standard deviation = ", sqrt(exp(dirsigma^2) - 1)
scalar drop average sumsquare var sd

** Generate direct (scope 1) emissions by multiplying the matrix of disturbances dirdist into the expected outcomes in dirav
mat scope1 = hadamard(dirav,dirdist)

*********************************************************************************************************
** Matrix 1: Randomize products being used as inputs
*********************************************************************************************************
** Create an indicator matrix that indicates -- for each product -- one input of each product group.
** This is done by creating a random matrix and choosing the maximum value within each input group, by product.
** In its columns, for each product j this matrix identifies one input per product group.
** Thus, if numgroup is the number of groups, it has numgroup unity entries in each column.
set seed 0
matrix indic = J(`sumprod',`sumprod', 0)
matrix rnd = matuniform(`sumprod', `sumprod')
forvalues j = 1/`sumprod' { // loop over products
	local endgroup = 0 // initiate the beginning and end of the product group indices
	forvalues i = 1/`numgroup' { // loop over input product groups of product j
		scalar candidate = 0
		local begingroup = `endgroup' + 1 // update beginning and ending of product group indices
		local endgroup = `endgroup' + numprod[`i',1]
		forvalues k = `begingroup'/`endgroup' { // loop over the indices of input product group i
			if rnd[`k',`j'] > candidate { // identify the maximum value -- this input will be used
				scalar candidate = rnd[`k',`j']
				scalar index = `k'
			}
		}
		mat indic[index,`j'] = 1 // mark position of the maximum in the indicator matrix
	}
}

** List submatrix
mat sample = indic[1..20,1..20]
mat list sample
mat drop sample

*********************************************************************************************************
** Matrix 2: Expected values for product level input coefficients
*********************************************************************************************************
** Create a BOM averages matrix holding product group averages for individual products.
** The matrix is assembled by concatenating submatrices for product groups.
forvalues i = 1/`numgroup' { // BOM average coefficients for products of group i
	forvalues j = 1/`numgroup' { // products of group j as an input source
		mat av_ij = J(numprod[`j',1],numprod[`i',1], 1) * io_coeff[`j',`i']
		mat av_i = nullmat(av_i)\av_ij // concatenating supplies from products of group j to the BOM submatrix for i
	}
	mat av = nullmat(av),av_i // concatenating the finished submatrix for i to the overall BOM averages matrix
	mat drop av_i
}

*********************************************************************************************************
** Matrix 3: Lognormal shock terms to the requirement matrix
*********************************************************************************************************
** Create a conforming matrix with lognormal disturbances.
** Lognormal shock terms are always positive and can be interpreted as the outcome of a series of independent multiplicative shocks.
** The distribution parameters of the underlying normal are mu = -sigma^2/2 and sigma.
** This translates to a lognormal distribution with expectation 1 and standard deviation sqrt(exp(sigma^2) - 1).
** The parameter sigma relates to the standard deviation of input demand from industries. Otherwise, the standard deviation of
** input demand would vary with the number of products, as the standard deviation of energy demand increases with the square root of the number of energy companies.

** Old calibration (discussion paper):
** We want to calibrate to the standard deviation of log mean deviations of scope 2 emissions in the Trucost data. Electricity is one input good.
** Scope 2 emissions may be assumed to be proportional to electricity demand.
** Calibrating to the standard deviation of a log has the advantage that it is standardised and unit free:
** -- We do not need to transform scope 2 emissions into electricity demand,
** -- There is no need to correct for currency denominations,
** -- For the same reason we can keep the financial industry and wholesale and retail trade in the sample, although their output is not well approximated by revenue.
** In the general case, we need to calibrate the standard deviation of a sum, the sum over electricity providers. If the individual products are lognormally distributed,
** this is the standard deviation of the log of a sum of lognormals. There are three ways of linking sigma to the standard deviation of the log of a sum:
** 1) The dissertation of Brückner shows that a sum of lognormals is APPROXIMATELY lognormal itself. We can calculate the sd of the sum from the
**    standard formula. The log of this sum is approximately normal. We need to choose sigma such that the sd of this normal variate is equal to the
**    SD in the Trucost data set.
** 2) By numerical means (grid search) we can choose sigma such that the SD of the log of the sum equals the SD of log scope 2 emissions in Trucost.
** 3) (Here) We assume that there is but one electricity provider per firm. In this case, the standard deviation of disturbances can be directly calibrated to the
**    standard deviation of the log mean deviation of scope 2 emissions.
** With more than one input per group we would have to adjust the standard deviation of the products to the number of products in the group,
** as with more products, the standard deviation will increase.
** General case:
** scalar sigma = sigmagroup * sqrt(numprod[8,1]) // Individual level sigma is scaled to the number of energy companies.
                                                  // If the number of firms per sector doubles for all sectors, the
                                                  // standard deviation of all groups -- not only the energy companies -- remains the same.
                                                  // Attention: in the 12 sector IO table, energy supply is in the same group
                                                  // as water supply, sewage, and waste.
** scalar sigma = 1.140992
** scalar sigma = .2129105

******************************************************************************************************
** We do the simulation for different values of sigma
******************************************************************************************************
foreach item of numlist 1/8 {
	local sigma = (`item'-1)/20 // This is 0 for item = 1 and 0.35 for item = 8
	** Remark: We need an order number 'item' for labelling the saved simulation results -- using sigma as a label does not work because of the decimal point

	display "################################################################"
	display "################################################################"
	display "## SIMULATION -- parameterisation `item', sigma = `sigma' "
	display "################################################################"
	display "################################################################"

	** Do not reset the seed here: otherwise the shocks to emission intensities (above) and to input requirements would be perfectly correlated
	mat dist = matuniform(`sumprod', `sumprod')
	forvalues i = 1/`sumprod' {
		forvalues j = 1/`sumprod' {
			scalar prob = dist[`i',`j']
			scalar vnormal = invnormal(prob)
			scalar vlognormal = exp(vnormal*`sigma' - `sigma'^2/2)
			mat dist[`i',`j'] = vlognormal
		}
	}

	** Check disturbance matrix
	display "columns dist: ", colsof(dist)
	display "rows dist: ", rowsof(dist)

	***********************************************************************************************
	** Check: Compute empirical moments of the disturbances to input requirements
	***********************************************************************************************
	scalar sumsquare = 0
	scalar sum = 0
	forvalues i = 1/`sumprod' {
		scalar sum = sum + dist[`i',1]
		scalar sumsquare = sumsquare + (dist[`i',1])^2 // summing up the squares
	}
	scalar average = sum/`sumprod'
	scalar var = sumsquare/`sumprod' - average^2 // variance via the shift theorem ("Verschiebesatz")
	scalar sd = sqrt(var) // standard deviation
	di "------- Empirical moments of shocks to input requirements (lognormal) ----------------"
	di "Sample mean = ", average
	di "Sample variance = ", var
	di "Sample std = ", sd
	di "------- Theoretical moments ----------------"
	di "Mean = 1"
	di "Variance = ", exp(`sigma'^2) - 1
	di "Standard deviation = ", sqrt(exp(`sigma'^2) - 1)
	scalar drop average sumsquare var sd

	*********************************************************************************************************
	** Putting the three matrices together: create the BOM matrix
	*********************************************************************************************************
	** Generate the BOM by multiplying the matrix of disturbances dist, conditional on indic, into the expected outcomes from matrix av.
	** The input coefficient for the one product per group is such that, on average, the input coefficient of the group is as given by the industry input coefficients.
	mat shock = hadamard(indic,dist)
	mat bom = hadamard(av,shock)

	** Check matrix
	display "columns BOM: ", colsof(bom)
	display "rows BOM: ", rowsof(bom)

	** Check row averages -- 100 values
	forvalues i = 1/100 {
		scalar sum = 0
		forvalues j = 1/`sumprod' {
			scalar sum = sum + dist[`i',`j']
		}
		display sum/`sumprod'
	}

	** List submatrix
	mat sample = bom[1..20,1..20]
	mat list sample
	mat drop sample
	matrix drop dist

	** Compute eigenvalues of bom and check the first 50 (taking into account leading zeros)
	mat eigenvalues re im = bom
	mat check = re[1,1..50] \ im[1,1..50]
	mat list check

	** The saved eigenvalues are not sorted according to their modulus.
	** Find the three eigenvalues with the largest moduli.
	scalar first = 0
	scalar second = 0
	scalar third = 0
	forvalues i = 1/`sumprod' {
		scalar eigen = sqrt(re[1,`i']^2 + im[1,`i']^2)
		if eigen > first {
			scalar third = second
			scalar second = first
			scalar first = eigen
		}
		else if eigen > second {
			scalar third = second
			scalar second = eigen
		}
		else if eigen > third {
			scalar third = eigen
		}
	}
	display "###############################################################################"
	display "The three eigenvalues with largest moduli: ", first," ", second, " ", third
	display "###############################################################################"
	scalar drop eigen first second third

	**************************************************************************
	** Create vector of weights for calculating totals
	** Weights are based on total use, including imports
	**************************************************************************
	** Create vector that holds the total use of the group, divided by the number of products in the group
	forvalues i = 1/`numgroup' { // products of group i
		mat weight_i = J(numprod[`i',1],1,tot_use[`i',1]/numprod[`i',1])
		mat weight = nullmat(weight)\weight_i // concatenating values for products of group i to the weights vector
	}
	display "Check: Number of weights ", rowsof(weight)

	******************************************************************************
	** Check -- input use: Generate deviations of electricity demand in Euro from the mean to compute the std dev
	** Note: in the 71 sector version, electricity comes together with heat and air conditioning supply, product group 29
	******************************************************************************
	forvalues i = 1/`sumprod' { // vector is created for all products
		scalar sum = 0
		forvalues j = 1/`sumprod' { // Check all inputs
			if prodgroup[`j',1] == 29 {
				scalar sum = sum + shock[`j',`i'] * scope1[`j',1] // This is the multiplicative shock to electricity demand:
				                                                 // requirement shock of the firm times direct emission shock of the electricity provider
			}
		}
		mat scope2_shock = nullmat(scope2_shock)\sum // Write the resulting sum into the appropriate element of the vector
	}

	** Compute SD of the deviation of total electricity demand from the mean
	scalar sumsquare = 0
	scalar sum = 0
	forvalues i = 1/`sumprod' {
		scalar sum = sum + log(scope2_shock[`i',1])
		scalar sumsquare = sumsquare + (log(scope2_shock[`i',1]))^2 // summing up the squares
	}
	scalar average = sum/`sumprod'
	scalar var = sumsquare/`sumprod' - average^2 // variance via the shift theorem ("Verschiebesatz")
	scalar sd = sqrt(var) // standard deviation
	di "------- Shocks to log electricity intensity ----------------"
	di "Average shock: ", average
	di "Variance shock: ", var
	di "SD shock: ", sd
	scalar drop average sumsquare var

	** Compute scope 2 emissions: electricity demand per Euro of output, multiplied by the direct emissions of the energy providers.
	** Note: In a prior version, scope 2 intensity was calculated as electricity demand per Euro of output times the carbon content of electricity.
	** In reality there may be more than one type of electricity.
	forvalues i = 1/`sumprod' { // Scope 2 emissions are computed for all products
		scalar sum = 0
		forvalues j = 1/`sumprod' { // Look for electricity demand and sum up the associated carbon contents
			if prodgroup[`j',1] == 29 {
				scalar sum = sum + bom[`j',`i'] * scope1[`j',1] // In the simple version, there is but one type of electricity.
				                                               // In general, this is a sum.
			}
		}
		mat scope2 = nullmat(scope2)\sum // Write the resulting sum into the appropriate element of the vector
	}

	******************************************************************************************
	** III. COMPUTING LEONTIEF INVERSE AND CARBON CONTENTS
	******************************************************************************************

	** Compute the Leontief inverse directly
	mat leontief = inv(I(`sumprod') - bom)

	** Compute carbon costs
	mat cc = (scope1' * leontief)'

	******************************************************************************************
	** IV. LOADING CARBON CONTENTS AND OTHER RESULTS INTO DATA PLATFORM TO DO GRAPHS AND STATS
	******************************************************************************************
	drop _all

	** Load vector of product groups
	svmat prodgroup, names(prod_group) // product group
	rename prod_group1 prod_group // get rid of the subscript that svmat appends to the name
	label var prod_group "Product group"
	label values prod_group prodgroups

	** Product ID
	generate prod_id = _n
	order prod_id prod_group

	** Product weight
	svmat weight
	rename weight1 weight
	label var weight "weights derived from total sector use and # products"

	** Carbon contents from the Leontief inverse
	svmat cc
	rename cc1 cc
	label var cc "Product carbon contents"

	** Scope 1 emissions
	svmat scope1, names(sc1)
	rename sc11 sc1
	label var sc1 "Scope 1 emissions"

	** Scope 2 emissions
	svmat scope2, names(sc2)
	rename sc21 sc2
	label var sc2 "Scope 2 emissions"

	** Indirect emissions, computed as the difference between carbon costs and direct emissions
	gen indir = cc - sc1
	label var indir "Indirect emissions"

	** Scope 3 emissions, computed as the difference between indirect emissions and scope 2 emissions
	gen sc3 = indir - sc2
	label var sc3 "Scope 3 emissions"

	** Create init1 as vector of initial estimates for product level carbon costs from group level values
	forvalues i = 1/`numgroup' { // products of group i
		mat init1_i = J(numprod[`i',1],1, cc_group[`i',1])
		mat init1 = nullmat(init1)\init1_i // concatenating values for products of group i to the initial values vector
		*display "sector: ", `i'
	}
	display "rows initial carbon contents 1 (industry values): ", rowsof(init1)

	** Group level carbon contents as a reference
	svmat init1, names(cc_group)
	rename cc_group1 cc_group
	label var cc_group "Sector level carbon contents"

	** Check carbon content
	** How far are the product level carbon contents from their industry level projections?
	sum cc, d
	gen diff = cc - cc_group
	sum diff, d

	** Product group level
	tabstat cc cc_group diff, by(prod_group) stat(mean sd)
	mat list cc_group // This matrix is the industry level tabulation, not the product level assignment in the Stata variable of the same name

	** There is a bias in the sector level carbon content estimations:
	** Individual level carbon contents are calculated as a matrix power expansion. This is a nonlinear convex function,
	** and the mean of such a function will be larger than the function of the mean.

	** Create init2 as a simple overall average of carbon contents
	scalar av_cc = 0
	forvalues i = 1/`sumprod' {
		scalar av_cc = av_cc + init1[`i',1]
	}
	scalar av_cc = av_cc/`sumprod'
	mat init2 = J(`sumprod',1,1) * av_cc
	display "rows initial carbon costs 2 (overall average): ", rowsof(init2)

	** Tabulation
	sum cc sc1 sc2 indir, d
	tabstat cc sc1 sc2 indir, by(prod_group) stat(mean sd)
	** Attention: additional tables and a graph are produced in 30_Tabulation.do

	save "$sim_out\carbon_content_`item'", replace

	*****************************************************************************************************************
	** V. DO A LOOP FOR TWO DIFFERENT INITIAL VALUES FOR CARBON CONTENTS
	*****************************************************************************************************************
	foreach init in init1 init2 {

		display "################################################################"
		display "## Evolution in loop -- parameterisation `item', sigma = `sigma' "
		display "## Initial value: `init' "
		display "################################################################"

		keep prod_id prod_group cc

		** Compute carbon costs by expansion
		** Set initial values
		mat itcc = `init'
		mat pathcc = `init'
		mat itcc_lag = itcc
		di
		di "Computation of learning process for `init'"
		forvalues i = 1/20 {
			di
			di "Iteration ", `i'
			matrix itcc = (scope1' + itcc' * bom)'
			di "Relative difference to iteration i-1: ", mreldif(itcc, itcc_lag)
			mat itcc_lag = itcc
			mat pathcc = pathcc,itcc
			di
		}
		di
		di "Relative difference to true carbon contents: ", mreldif(itcc, cc)

		** Path of individual level carbon content estimation
		svmat pathcc, names(it)

		** Looking at convergence
		di "Tracking learning process for `init'"
		tabstat it1 - it21, by(prod_group)
		forvalues i = 1/21 {
			gen abs`i' = abs(it`i' - cc)
		}
		forvalues i = 1/21 {
			gen relabs`i' = abs`i'/cc
		}
		save "$sim_out\trace_param_`item'_`init'", replace

		*****************************************************************
		** Prepare and save graph of the adjustment process
		*****************************************************************
		di
		di "Absolute and relative deviations for `init'"
		if "`init'" == "init1" {
			local initlabel = "informative prior"
		}
		if "`init'" == "init2" {
			local initlabel = "uninformative prior"
		}

		** Prepare absolute deviations for analysis
		di
		di "1) Absolute deviations for `init'"
		tabstat abs1 - abs10, by(prod_group) save
		** Transposing in order to write to the variable editor and allow plotting
		forvalues i = 1/`numgroup' {
			matrix adj_`init' = nullmat(adj_`init'),r(Stat`i')'
		}
		matrix adj_`init' = adj_`init', r(StatTotal)'

		** Prepare relative deviations for analysis
		di
		di "2) Relative deviations for `init'"
		tabstat relabs1 - relabs10, by(prod_group) save
		** Transposing in order to use time series plotting capabilities
		forvalues i = 1/`numgroup' {
			matrix reladj_`init' = nullmat(reladj_`init'),r(Stat`i')'
		}
		matrix reladj_`init' = reladj_`init', r(StatTotal)'

		** Writing deviations data to the variable editor
		preserve
		drop _all

		** Generate panel of absolute deviations over "time" from the table
		svmat adj_`init', names(m)
		rename m72 total_adj

		** Generate panel of relative deviations over "time" from the table
		svmat reladj_`init', names(r)
		rename r72 total_reladj

		** Generate "time" variable
		gen t = _n-1
		tsset t

		** Plotting absolute deviations as time series
		di "Plotting absolute deviations for `init'"
		twoway ///
			(tsline m1) (tsline m2) (tsline m3) (tsline m4) (tsline m5) (tsline m6) (tsline m7) (tsline m8) (tsline m9) (tsline m10) ///
			(tsline m11) (tsline m12) (tsline m13) (tsline m14) (tsline m15) (tsline m16) (tsline m17) (tsline m18) (tsline m19) (tsline m20) ///
			(tsline m21) (tsline m22) (tsline m23) (tsline m24) (tsline m25) (tsline m26) (tsline m27) (tsline m28) (tsline m29) (tsline m30) ///
			(tsline m31) (tsline m32) (tsline m33) (tsline m34) (tsline m35) (tsline m36) (tsline m37) (tsline m38) (tsline m39) (tsline m40) ///
			(tsline m41) (tsline m42) (tsline m43) (tsline m44) (tsline m45) (tsline m46) (tsline m47) (tsline m48) (tsline m49) (tsline m50) ///
			(tsline m51) (tsline m52) (tsline m53) (tsline m54) (tsline m55) (tsline m56) (tsline m57) (tsline m58) (tsline m59) (tsline m60) ///
			(tsline m61) (tsline m62) (tsline m63) (tsline m64) (tsline m65) (tsline m66) (tsline m67) (tsline m68) (tsline m69) (tsline m70) ///
			(tsline m71) ///
			(tsline total_adj), ytitle(mean abs(cc(t) - cc)) ttitle(# iterations) title(Convergence of PCC by product group) ///
			subtitle(Mean absolute error / `initlabel' / sigma: `sigma') legend(off)
		graph save "Graph" "$sim_out\Adjust_71_`item'_`init'", replace

		** Plotting absolute deviations relative to carbon costs as time series
		di "Plotting absolute deviations relative to carbon content for `init'"
		twoway ///
			(tsline r1) (tsline r2) (tsline r3) (tsline r4) (tsline r5) (tsline r6) (tsline r7) (tsline r8) (tsline r9) (tsline r10) ///
			(tsline r11) (tsline r12) (tsline r13) (tsline r14) (tsline r15) (tsline r16) (tsline r17) (tsline r18) (tsline r19) (tsline r20) ///
			(tsline r21) (tsline r22) (tsline r23) (tsline r24) (tsline r25) (tsline r26) (tsline r27) (tsline r28) (tsline r29) (tsline r30) ///
			(tsline r31) (tsline r32) (tsline r33) (tsline r34) (tsline r35) (tsline r36) (tsline r37) (tsline r38) (tsline r39) (tsline r40) ///
			(tsline r41) (tsline r42) (tsline r43) (tsline r44) (tsline r45) (tsline r46) (tsline r47) (tsline r48) (tsline r49) (tsline r50) ///
			(tsline r51) (tsline r52) (tsline r53) (tsline r54) (tsline r55) (tsline r56) (tsline r57) (tsline r58) (tsline r59) (tsline r60) ///
			(tsline r61) (tsline r62) (tsline r63) (tsline r64) (tsline r65) (tsline r66) (tsline r67) (tsline r68) (tsline r69) (tsline r70) ///
			(tsline r71) ///
			(tsline total_reladj), ytitle(mean abs(cc(t) - cc)/cc) ttitle(# iterations) title(Convergence of PCC by product group) ///
			subtitle(Mean absolute error relative to true CC / `initlabel' / sigma: `sigma') legend(off)
		graph save "Graph" "$sim_out\Reladjust_71_`item'_`init'.gph", replace

		save "$sim_out\Abs_error_graph_data_`item'_`init'", replace
		restore
	}

	** Drop the matrices generated in the loop -- they are built by accretion (nullmat concatenation)!
	matrix drop bom shock im re leontief cc scope2 scope2_shock weight weight_i init1 init2 init1_i itcc itcc_lag pathcc adj_init1 adj_init2 reladj_init1 reladj_init2
}

***************************************************************************
** Bye...
***************************************************************************
log close
exit
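** Illustrative cross-check (kept after -exit-, so Stata ignores it): the two building blocks used above --
** lognormal shocks with mu = -sigma^2/2 that have mean 1, and the learning recursion cc <- scope1 + bom' * cc
** converging to the Leontief solution -- can be verified numerically outside Stata. The NumPy sketch below
** uses small random stand-ins (n, bom, scope1 here are hypothetical, not the matrices built in this do-file).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Small input coefficients keep the spectral radius of bom below 1, as required for convergence
bom = rng.uniform(0.0, 0.15, size=(n, n))
scope1 = rng.uniform(0.0, 2.0, size=n)        # direct (scope 1) emission intensities

# Closed form: cc = (I - bom')^{-1} scope1, the product level Leontief solution
cc = np.linalg.solve(np.eye(n) - bom.T, scope1)

# Power series ("learning") iteration as in Section V: itcc = (scope1' + itcc' * bom)'
it = scope1.copy()
for _ in range(200):
    it = scope1 + bom.T @ it
assert np.allclose(it, cc)                    # the iteration reproduces the Leontief solution

# Lognormal shocks with underlying mu = -sigma^2/2 have mean 1 and sd sqrt(exp(sigma^2) - 1)
sigma = 0.3
draws = np.exp(rng.normal(-sigma**2 / 2, sigma, size=1_000_000))
print(round(draws.mean(), 2))                 # ~ 1.0
print(round(np.sqrt(np.exp(sigma**2) - 1), 3))  # theoretical sd: 0.307
```

** The same logic underlies the 7698-product case: convergence only requires that the largest eigenvalue
** modulus of the BOM matrix (checked above) stays below one.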