Writing about (or ridiculing, depending on your perspective) the American college admissions system was on my agenda. Now I probably won't, because there is a better article:
http://www.newrepublic.com/article/114848/college-admissions-criteria-american-vs-british
We think we know. Actually, we just happen to know with some confidence, yet never with full certainty. We do not know for certain who we are, what we can accomplish, or what we will become. We never know what will happen to us, or who the people around us really are. Life is a journey in which we grapple with these uncertainties and try to understand them.
Monday, September 30, 2013
Sunday, September 29, 2013
About Models
I did not intend to write this, but I felt I needed to after reading a comment on my post.
There are two kinds of models. One kind exists to make a point or to formalize an insight. The other kind is meant to model reality closely and is intended to be brought to empirical tests.
The first kind is designed to be as simple as possible, but not simpler. The second kind is designed to be as realistic as possible: we calibrate the parameters and estimate the distributions if there are stochastic elements, and each parameter or variable has a direct empirical interpretation---none of which is true for the first kind.
I will probably only discuss and present the first kind of model on this blog (I tend to prefer them in research as well). So what should you make of my model? There are some basic take-aways, and those are why I wrote the model down. There are also many conclusions specific to the model's assumptions; do not pay attention to those, as they are not likely to be robust. The general rule is: do not be religious about the numbers I generate, since they depend heavily on the model assumptions. Nevertheless, I try to present robust qualitative results.
There are plenty of criticisms I could level against the normal distribution, but I will use it anyway: it is unlikely to change the basic idea, and it is convenient.
I will end with a semi-relevant anecdote. In class one day, Professor Helpman decided to present a general model of multiple goods. One "theoretical" guy in class asked: "Why don't we make the model more realistic and generalize to a continuum of goods?" Not a super impressive question. With a finite number of molecules in the universe (yes, a large number), a "continuum" is no more realistic than a "general N".
Saturday, September 28, 2013
The Dirty Secrets of Achievement
Nowadays people are obsessed with achievements: what school we get into, what jobs we get, how much money we make, and how famous we are.
Every time some famous or accomplished person talks, there will be disciples eagerly listening to the words of wisdom and religiously putting them into use.
Let us be skeptical. I have written about the prosecutor's fallacy in my post Official Nonsense. In essence, while the probability of success conditional on a good strategy or talent may be high, the probability of a good strategy or talent conditional on success might not be. In other words, we cannot infer that much from success. What they did might not get you anywhere.
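To make the gap between the two conditional probabilities concrete, here is a toy Bayes calculation in MATLAB (the same language as the code at the end of this post); every number in it is invented purely for illustration.

% Toy Bayes calculation: all numbers invented purely for illustration.
p_good            = 0.05;   % assumed share of people with the good strategy/talent
p_succ_given_good = 0.90;   % success is very likely given the good strategy
p_succ_given_bad  = 0.20;   % but success also happens without it
% unconditional probability of success (law of total probability)
p_succ = p_succ_given_good*p_good + p_succ_given_bad*(1 - p_good);
% probability of the good strategy given success (Bayes' rule): about 0.19
p_good_given_succ = p_succ_given_good*p_good / p_succ

Even though success is four and a half times more likely with the good strategy, fewer than one in five of the successful have it, simply because the good type is rare.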
Let me do a calibration. Suppose there are four strategies: A, B, C, and D. You can also think of them as four talent groups. Each individual in a group produces a random outcome following a normal distribution, with a different mean (average level) and standard deviation (dispersion) for each group: group A is N(3,1), B is N(2,2), C is N(1,3), and D is N(0,16), where the second number is the standard deviation. In other words, in expectation group A is the best, not just in terms of average outcome but also in terms of risk---group A has the lowest risk. From a mean-variance point of view, group A dominates the other groups. No question. Now let us run a simulation and calculate, among those above the 90th, 95th, and 99th percentiles of the pooled outcomes, what fraction come from each group. Here are my results.
Group   90th     95th     99th
A       0.0025   0        0
B       0.0525   0        0
C       0.1050   0        0
D       0.8400   1.0000   1.0000
The results are striking! Most of the successful people come from group D, and all of the super-successful come from group D!
This simulation uses 1000 people in each group.
Let us check robustness: what about only 100 people in each group? Here is the result:
Group   90th     95th     99th
A       0        0        0
B       0.0250   0        0
C       0.1000   0        0
D       0.8750   1.0000   1.0000
The moral of the study is NOT that successful people took the inferior path. The simulation only cautions against inferring too much from outcomes. The reason I used different means for each group is to demonstrate that, given enough risk, you can expect people to stand out even if they are not the type who would normally be expected to succeed. It is the triumph of risk in ex post observation over ex ante expectation.
Basically the take-away is: worship is unnecessary, and above all, do not worship yourself.
I have attached the MATLAB code I used, in case curious readers want to play with different specifications. The result is quite robust.
% Simulation with 100 people in each group (the robustness check)
rng(1)                          % fix the random seed for reproducibility
a=normrnd(3,1,100,1);           % group A: mean 3, standard deviation 1
b=normrnd(2,2,100,1);           % group B: mean 2, standard deviation 2
c=normrnd(1,3,100,1);           % group C: mean 1, standard deviation 3
d=normrnd(0,16,100,1);          % group D: mean 0, standard deviation 16
complete=[a;b;c;d];             % pool all outcomes
achievement90=quantile(complete,0.90);   % 90th-percentile cutoff of the pooled outcomes
achievement95=quantile(complete,0.95);
achievement99=quantile(complete,0.99);
m=[a';b';c';d'];                % one row per group
% Share of each group among those above each cutoff: each group is one quarter of
% the population, so divide the within-group fraction by 0.10/0.25 = 0.4 for the
% 90th cutoff, and by 0.2 and 0.04 for the 95th and 99th cutoffs.
summary=[mean(m>achievement90,2)/0.4, mean(m>achievement95,2)/0.2, mean(m>achievement99,2)/0.04]
% Simulation with 1000 people in each group
a=normrnd(3,1,1000,1);          % group A: mean 3, standard deviation 1
b=normrnd(2,2,1000,1);          % group B: mean 2, standard deviation 2
c=normrnd(1,3,1000,1);          % group C: mean 1, standard deviation 3
d=normrnd(0,16,1000,1);         % group D: mean 0, standard deviation 16
complete=[a;b;c;d];
achievement90=quantile(complete,0.90);
achievement95=quantile(complete,0.95);
achievement99=quantile(complete,0.99);
m=[a';b';c';d'];
% Same tail-share calculation as above
summary=[mean(m>achievement90,2)/0.4, mean(m>achievement95,2)/0.2, mean(m>achievement99,2)/0.04]
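For readers who want to try other specifications, here is a hypothetical helper function of my own (tail_shares is not part of the original script); note that it normalizes by the actual number of people above each cutoff rather than by the nominal tail fractions, so its output can differ slightly from the tables above.

% tail_shares.m --- hypothetical helper, not part of the original post.
% mus, sigmas: vectors of group means and standard deviations; n: people per group.
function shares = tail_shares(mus, sigmas, n)
    rng(1)                                    % reproducible draws
    G = numel(mus);
    outcomes = zeros(n, G);
    for g = 1:G
        outcomes(:,g) = normrnd(mus(g), sigmas(g), n, 1);
    end
    pooled = outcomes(:);                     % pool everyone together
    probs  = [0.90 0.95 0.99];
    shares = zeros(G, numel(probs));
    for k = 1:numel(probs)
        cutoff = quantile(pooled, probs(k));
        counts = sum(outcomes > cutoff, 1);   % people above the cutoff, by group
        shares(:,k) = counts' / sum(counts);  % each group's share of the top tail
    end
end

For example, tail_shares([3 2 1 0],[1 2 3 16],1000) replays the four-group specification above, and changing the last entry of the second argument varies group D's risk.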
Friday, September 20, 2013
about grad school
So, an update on grad school.
So far so good, despite the fact that I have already gotten sick twice. I am extremely happy with where I am and whom I am with. It is easy to say this since winter has not yet come, but we all know that "winter is coming, and it is going to be long."
I really like the people. I now actually enjoy going to my office, because I get to see people I enjoy working with, and I hope to collaborate with them on future work. A great thing about Cambridge is that I simply have more friends to hang out with here, and that makes a much larger difference than I anticipated.
I like the program. It is super flexible, which is a big plus for me, and I am only taking classes I am interested in. Fabulous classes so far---I hope I can still say this after my first Matlab exercise. I had the great fortune of having Daniel as my TF. For one thing, I actually get to meet him regularly; for another, he is just so good at teaching, which is rare among grad students.
I really enjoy being a grad student. Yes, I get paid very little compared to outside jobs. (This answers a question some people pose: how can academia pay so little without worrying that it will fail to attract people?) I feel I am learning interesting things every day, and I get insights often. I constantly have ideas; some are crappy, some are great. Among the great ones, I often find they have already been published. But there are survivors, and I look forward to working on them after I am done with coursework.
There is a lot I wish to write about, but I am a little behind on work due to my untimely and frequent sickness. Here is a list of things I hope to write about:
1. myopic tracking and incentive distortion.
2. internal consistency and rational expectation---when necessary?
3. incentivized dogmatism
4. empirics is king.
Monday, September 16, 2013
Some ranting against technology
This is not going to be a very coherent argument, as I still have a fever. I will revise it soon.
So there is a trend to tech things up---the most extreme being "no child left untableted." I am personally against it. For one thing, there is no evidence that the marginal benefit of introducing the technology justifies the cost, which consists of government expenditure (which in turn includes staffing costs) and the deadweight loss associated with the taxation needed to fund it. To be honest, I doubt there is any marginal benefit for kids introduced to all this fancy stuff.
I am a reluctant technology user---by reluctant I mean that I have taken up, say, smartphones only because they are the better option for me given that everyone else is using them. However, my overall satisfaction has fallen dramatically. I would be much better off had smartphones never been introduced at all.
By imposing technology on kids so early on, we are only reinforcing this effect.
Ultimately, I wish to retire from using smartphones, and then email, like Donald Knuth.
Thursday, September 12, 2013
On microfoundation
There are two ways we can model the whole macroeconomy. We can model the aggregates directly: aggregate consumption, the price level, and so forth. Alternatively, we can build from the bottom up: we start from people's decision making---solving their optimization problems---then model their interactions, and finally the whole market. The second approach is multi-layered: in each layer you specify the rules, study each layer separately, working from the bottom up and using the results solved for in the previous layer as inputs. The idea of micro-founded economics is illustrated in the following diagram.
[Diagram: An Illustration of Microfoundation]
A digression for the super serious: the diagram above shows a hierarchical model in which different groups of individuals make decisions sequentially. In reality, and in the literature, different groups often make decisions simultaneously while taking into account what the other groups will do (and the other groups take into account what this group is doing, in an infinite loop); for those who know game theory, we are looking at a Nash equilibrium.
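To make the contrast between the two approaches concrete, here is a toy sketch of my own in MATLAB (the language I use elsewhere on this blog); the functional forms and every number in it are invented purely for illustration, not a serious model of anything.

% Toy contrast between the two approaches; all numbers are invented.
% Approach 1 (top-down): posit an aggregate consumption function directly.
Y = 100;                         % aggregate income
mpc = 0.7;                       % posited marginal propensity to consume
C_topdown = mpc * Y;

% Approach 2 (bottom-up): start from individual optimization and aggregate.
% Each household maximizes log(c1) + beta*log(c2) subject to c1 + c2/(1+r) = y,
% which gives c1 = y/(1+beta).
rng(1)
N = 1000;                        % number of households
y = (Y/N) * ones(N,1);           % split income equally across households
beta = 0.90 + 0.10*rand(N,1);    % heterogeneous discount factors (assumed)
c1 = y ./ (1 + beta);            % each household's optimal current consumption
C_bottomup = sum(c1);            % aggregate consumption emerges from the layer below
% A policy that shifts beta (say, a savings subsidy) passes through the bottom-up
% model automatically, while the posited mpc = 0.7 would simply have to be re-estimated.
[C_topdown, C_bottomup]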
So an obvious observation is that the second approach is much more complicated. Why bother to build from the bottom up if all you are interested in is the top level? Indeed, economists took the first approach for a long time, until along came the famous Lucas Critique:
"Given that the structure of an econometric model consists of optimal decision rules of economic agents, and that optimal decision rules vary systematically with changes in the structure of series relevant to the decision maker, it follows that any change in policy will systematically alter the structure of econometric models."
What the Lucas Critique says, in essence, is that when we model the top layer and then engineer some policy to bring about better outcomes, we change the environment individuals face, and that change passes through all the layers into the top layer, so our old model of the top layer becomes invalid. The most famous illustration is the Phillips Curve.
So we have turned to the second approach, with a lot of added effort. The extra effort does not mean we are golden. Quite the contrary. The problem is that when we model each layer, the model is highly imprecise. All those imprecisions add up and amplify each other, and by the time we get to the top, it is nothing like the real world---that is how we get highly unrealistic models that very few people have faith in. The standard approach in economics is to calibrate our models: find parameters (the very first inputs) that match the data. Because of the problem I mentioned, fitting the macro data can make our parameter estimates nonsensical at the micro level. Not surprisingly, micro-level and macro-level data often give us very different estimates.
Another digression for the super serious: "imprecision" is a bad word choice, as it suggests a mere error term that would be dealt with in statistical analysis. What I mean instead is structural deviation from reality. For example, imposing homothetic utility does not just generate random errors. Another point: we can always tweak our models so that they fit both micro-level and macro-level data. But are we just telling stories so that the data will fit, or are we actually making the first several layers more precise? In some sense the purpose then deviates from the true spirit of micro-foundation to spinning a story that resembles micro-foundation.
An example to illustrate. When I was young, I liked to play with ship models, especially the fancy ones you build from very tiny parts. They require very careful work, but I was clumsy. When I put the little pieces together in a less-than-perfect manner, I could no longer fit the resulting chunk into the frame of the ship. I used brute force, squeezing here and there until it more or less fit. But when I looked back at the little parts, they were badly distorted: several were displaced, some fell off, and it was just a mess. I think that is very similar to the mess economists are dealing with.
While my modeling could have been improved by better craftsmanship, the modeling problem of economists is in some sense hopeless: modeling these layers mathematically gets very complicated. To keep things tractable, we have to make numerous simplifying assumptions, not all of which are innocuous. Furthermore, by imposing those assumptions, we also limit the kinds of questions we can ask.
So which approach is better? I do not think there is a definitive answer yet, but I want to contribute my two cents. The critique of the first approach is that it is not robust to policy changes, but it can be fine in the absence of a policy change or paradigm shift. The second approach is meant to address this inadequacy, but it fails at the job: changes in policy or environment pass through the model highly distorted by the imprecision added at each layer.
Admittedly, if we take the first approach, we can be scientists, but we can never be engineers---the very attempt at engineering would invalidate our model. But could we ever aspire to be engineers in this field? There are always things we do not model, and those things might be irrelevant in the previous world yet matter a lot in an engineered one. If you think you can fine-tune the economy with DSGE, you are guilty of the pretense of knowledge. To conclude, let me be provocative and say that mathematical thinking and careful judgement plus intuition are not substitutes; together they make up rigorous thinking (though some arrogant people view mathematical thinking as necessary and sufficient for rigor).
After note: micro-foundation is the mainstream in macroeconomics, indeed in any branch of economics these days. I have thought about this for a long time but never wrote much about it. This post came out of a discussion I had with a friend, and I felt I needed to write it down to invite further discussion.
Sunday, September 8, 2013
an interesting comment
I saw an especially interesting comment today on Facebook. Let me quote it in full.
i wish i could say this only applied to bowdoin:
What does Bowdoin not teach? Intellectual modesty. Self-restraint. Hard work. Virtue. Self-criticism. Moderation. A broad framework of intellectual history. Survey courses. English composition. A course on Edmund Spenser. A course primarily on the American Founders. A course on the American Revolution. The history of Western civilization from classical times to the present. A course on the Christian philosophical tradition. Public speaking. Tolerance towards dissenting views. The predicates of critical thinking. A coherent body of knowledge. How to distinguish importance from triviality. Wisdom. Culture.
btw, here's some of what they do teach: Queer Gardens, Beyond Pocahontas: Native American Stereotypes; Sexual Life of Colonialism; Modern Western Prostitutes.
Explores how the garden in Western literature and art serves as a space for desire. Pays special attention to the link between gardens and transgression. Also considers how gardens become eccentric spaces and call into question distinctions between nature and culture. Examines the work of gay and lesbian gardeners and traces how marginal identities find expression in specific garden spaces. Reconsiders one of the founding myths of Western culture: the idea of a lost Eden. Authors and gardeners may include Marvell, Lanyer, Pope, Seward, Dickinson, Burnett, Carroll, Sackville-West, Nichols, Jarman, and Pollan.
Are we ready for knowledge?
Knowledge is a huge burden. Without it, we act on the default. Once we know the consequences of the default, however, it becomes impossible to keep acting on it without debating alternative actions, which inevitably involves moral judgement and clashes between value systems.
Let me give a concrete example. Studies have shown that when girls and boys are taught math and science in the same class, girls learn less than when they are taught separately, while boys learn better in the co-ed setting. The question is, should we separate them? Some would say yes, arguing that it is unacceptable to let boys do well at the expense of girls' suffering. That is one argument, but it takes boys and girls studying separately as the reference point. If our reference point is co-ed instead, and we consider the alternative, we might ask: is it ethical to make boys do worse so that girls learn better?
When we know nothing, choice does not carry a moral consequence; ignorance is amoral, not immoral. But when we know what each option entails, it is just tough. Anyone who paints an overly simplistic picture to further his own agenda is a demagogue at best, most likely a dangerous man, and a devil at worst.
As a future researcher, I could just focus on positive analysis and leave normative analysis to others. But I have to constantly remind myself that we live in a dangerous world, and positive analysis can easily be abused to benefit an ambitious and evil group.