In the last few months, I’ve posted links to a few articles with related implications:
- College students commonly choose majors based on unrealistic expectations about which careers they will lead to. For example, freshman journalism majors greatly overestimate their odds of becoming famous columnists or newscasters, and greatly underestimate their odds of becoming low-paid, low-status workers at small-town newspapers.
https://marginalrevolution.com/marginalrevolution/2022/10/do-students-choose-majors-rationally.html
- Computers are better at making college admissions decisions and workplace hiring decisions than human experts are: students admitted on the advice of machines tend to have higher GPAs afterward, and workers hired by machines tend to go on to have more successful careers (faster promotions, higher salaries).
https://www.siopsa.org.za/wp-content/uploads/2020/06/Kuncel-Klieger-Connelly-Ones-2013-Mech-v-Clinical.pdf
- Here’s a speculative article about a future system of education involving “hyper-individualized learning” and “micro-credentials” that would replace the old four-year college degree model. Computers would closely track each person’s knowledge and skills, and would provide short training programs to refresh important material, fill knowledge and skills gaps, and incrementally improve each person’s performance to suit the changing needs of their existing jobs or the different needs of new jobs they had just gotten.
https://futuristspeaker.com/future-of-education/hyper-individualized-learning-for-a-hyper-individualized-future/
In summary, when it comes to picking fields of study and work, humans are bad at doing it for themselves, bad at doing it for each other, and would be better off entrusting their fates to computers. While this sounds shocking, it shouldn’t be surprising–nothing in our species’ history has equipped us with the ability to perform these tasks well.
Consider that, for the first 95% of our species’ existence, there was no such thing as career choice or academic study. We lived as nomads always on the brink of starvation, and everyone spent their time hunting, gathering, or caring for children. Doing anything else for a living was inconceivable. People found their labor niches and social roles in their communities through trial and error, or sometimes through favoritism, and each person’s strengths and weaknesses were laid bare every day. Training and education took the form of watching more experienced people do tasks in front of you and gradually learning to do them yourself through hands-on effort. The notion of dedicating yourself to some kind of study or training that wouldn’t translate into a job or pay off for years was unthinkable.
For the next 4.9% of our species’ existence, more career options existed, but movement between them was rare and very hard. Men typically did what their fathers did (e.g., farmer, merchant, blacksmith), and breaking into many career fields was impossible thanks to restrictions based on social class, race, or ethnicity. For example, a low-caste Indian was forbidden to become a priest, and a black American was denied admission to medical school. Women were usually prohibited from working outside the home, and so had even fewer life choices than men. The overwhelming majority of people had little or no access to information and little ability to direct the courses of their own lives.
Only in the last 200 years, or 0.1% of our species’ existence, have non-trivial numbers of humans gained the ability to choose their own paths in life. The results have been disappointing in many ways. Young people, who are naturally ill-equipped to make major life choices, invest ever-larger amounts of time and money pursuing higher education credentials that turn out not to align with their actual talents, and/or that lead to underwhelming jobs. In the U.S., this has led to widespread indebtedness among young adults and to a variety of toxic social beliefs meant to vent their feelings of aggrievement and to (incorrectly) identify the causes of their early-life struggles and failures.
The fact that we’re poor at picking careers, as evidenced by two of the articles I linked to earlier and by a vast trove of others you can easily find online, isn’t surprising. As I showed, nothing in our species’ history has equipped us with the skills to satisfactorily choose jobs for ourselves or for other people. Nowhere near enough time has passed for natural selection to gift us with the unbiased self-insight and other cognitive tools we would need to do it well. If choosing the right field of study and career led to a person having more children than average, then the situation might be different after, say, ten more generations had passed.
Ultimately, most people end up “falling into” jobs that they are reasonably competent to perform and for which they have modest levels of passion, a lucky few end up achieving their childhood dreams, and an unlucky few end up chronically unemployed or saddled with jobs they hate. (I strongly suspect these outcomes have a bell curve distribution.)
As I said, the primary reason for this is that humans are innately mediocre judges of their own talents and interests, and are not much better at grasping the needs of the broader economy so they can pursue careers likely to prosper. In the U.S., I think the problem is particularly bad due to the Cult of Self-Esteem and related things like rampant grade inflation and the pervasive belief that anyone can achieve anything through hard work. There aren’t enough reality checks in the education system anymore; too many powerful people (i.e., elected politicians, education agency bureaucrats, and college administrators) have vested interests in perpetuating the current dysfunctional higher education system; and our culture has not come around to accepting that not everyone is cut out for success and that it’s OK to be average (or even below average).
And I don’t know if this is a particularly American thing, but the belief that each person has one true professional calling in life, and that they will have bliss and riches if only they can figure out what it is, is also probably wrong and leads people astray. A person might be equally happy in any one of multiple career types. And at the opposite end of the spectrum are people who have no innate passions, or who are only passionate about things that can’t be parlayed into gainful employment, like a person who absolutely loves writing poetry, but who writes poor-quality poetry and lacks the aptitude and creativity to improve it.
Considering all the problems, letting computers pick our careers for us should be the default option! After all, if you’re probably going to end up with an “OK” career anyway that represents a compromise between your skills and interests and what the economy needs, why not cut out the expensive and stressful years of misadventures in higher education by having a machine directly connect you with the job? No high school kid has ever felt passionate about managing a warehouse, yet some of them end up filling those positions and feeling fully satisfied.
Such a computer-based system would involve assigning each human an AI monitor during childhood. Each person would also take a battery of tests measuring IQ, personality, and manual dexterity during their teen years, administered multiple times to compensate for one-off bad test results. Machines would also interview each teen’s teachers and non-parent relatives to get a better picture of what they were suited for. (I’m resistant to relying on the judgments of parents because, while they generally understand their children’s personalities very well, their opinions about their children’s talents and potential are biased by emotion and pride. Most parents don’t want to hurt their children’s feelings, want to live vicariously through them, and like being able to brag to other people about their children’s accomplishments. For those reasons, few parents will advise their children to pursue lower-status careers, even if they know [and fear] that that is what they are best suited for.)
After compiling an individual profile, the computer would recommend a variety of career fields and areas of study that best utilize the person’s existing and latent talents, with attention also paid to their areas of interest and to the needs of the economy. At age 18, the person would be enrolled in work-study programs where they would have several years to explore all of the options. It would be a more efficient and natural way to place people into jobs than our current higher education system. By interning at the workplaces early on, young adults would get an unadulterated view of important factors like work conditions and pay.
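To make the matching step concrete, here is a minimal sketch of how such a recommender might rank career fields against an individual profile. Everything here is an invented illustration: the trait names, careers, weights, and scores are my assumptions, not anything from the articles linked above.

```python
# Hypothetical sketch: rank career fields for one person by combining
# measured aptitudes, stated interests, and economic demand.
# All trait names, careers, and weights are illustrative assumptions.

def rank_careers(aptitudes, interests, demand, requirements,
                 w_apt=0.5, w_int=0.3, w_dem=0.2):
    """Return (career, score) pairs sorted by a weighted fit score in [0, 1]."""
    scores = {}
    for career, traits in requirements.items():
        # Aptitude fit: average of the person's scores on the traits
        # this career depends on (all scores normalized to 0..1).
        apt = sum(aptitudes.get(t, 0.0) for t in traits) / len(traits)
        fit = (w_apt * apt
               + w_int * interests.get(career, 0.0)
               + w_dem * demand.get(career, 0.0))
        scores[career] = round(fit, 3)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Toy profile built from repeated testing (values normalized 0..1).
aptitudes = {"verbal": 0.9, "quantitative": 0.5, "dexterity": 0.7}
interests = {"journalism": 0.8, "logistics": 0.2, "nursing": 0.5}
demand    = {"journalism": 0.2, "logistics": 0.9, "nursing": 0.8}
requirements = {
    "journalism": ["verbal"],
    "logistics":  ["quantitative", "dexterity"],
    "nursing":    ["verbal", "dexterity"],
}

print(rank_careers(aptitudes, interests, demand, requirements))
```

Note how the weighting embodies the trade-off described above: aptitude dominates, but interest and economic demand still shift the ranking, so a high-demand field can outrank a mildly preferred one.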
And note that, even among highly successful people today, it’s common for daily work duties to make little or no use of what they learned in their higher education courses. Some argue that a four-year college degree is merely a glorified way of signaling to employers that you have a higher-than-average IQ and can stick to work tasks and get along with peers in pseudo-work settings reasonably well. Instead of charging young people tens or hundreds of thousands of dollars for those certifications, why not do it earlier, less obtrusively, and much more cheaply through the monitoring and testing I described?
While I think a computer-based system would be better for people on average and in the long run, it would also be psychologically shattering to many teenagers who got the bad news that their dream career was not in the cards for them. However, it is also psychologically shattering to pursue such dreams and to fail after many years of struggle and financial expenditure. Better to get over it as early as possible, and to enter the workforce faster and as more of an asset to the economy, with no time and money wasted on useless degrees, dropped majors, and career mistakes.
Finally, the same level of technology, integrated into the workforce, could raise the value of human capital throughout each person’s career arc. AI monitors would detect changes to each person’s skills and knowledge over time, as old things were forgotten and new things were learned. Having an up-to-date profile of a worker’s strengths and weaknesses would further optimize the process of matching them with positions for which they were best qualified. And through other forms of monitoring and analysis, AIs would come to understand the unique and changing demands of each line of work, and could custom-tailor continuing education “micro-credentials” to keep workers optimized for their roles.
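The ongoing tracking could be sketched in the same spirit: decay skills that go unused, reinforce skills observed in use, and flag gaps against a role's demands as refresher candidates. Again, the skill names, decay rates, and thresholds below are invented for illustration only.

```python
# Hypothetical sketch: keep a worker's skill profile current over time,
# then flag gaps against the (changing) demands of their role.
# Skill names, rates, and thresholds are illustrative assumptions.

DECAY = 0.9   # fraction retained per review period if a skill goes unused
GAIN  = 0.2   # boost when a skill is observed in use (capped at 1.0)

def update_profile(profile, skills_used):
    """Return a new skill profile after one review period."""
    return {skill: min(1.0, level + GAIN) if skill in skills_used
            else level * DECAY
            for skill, level in profile.items()}

def recommend_refreshers(profile, role_demands, threshold=0.6):
    """List skills the role needs where the worker has fallen below threshold."""
    return sorted(s for s, need in role_demands.items()
                  if need >= threshold and profile.get(s, 0.0) < threshold)

profile = {"sql": 0.8, "forklift": 0.9, "scheduling": 0.5}
# One review period in which only forklift work was observed:
profile = update_profile(profile, skills_used={"forklift"})
gaps = recommend_refreshers(
    profile, {"sql": 0.7, "scheduling": 0.8, "forklift": 0.9})
print(gaps)
```

Run repeatedly, a loop like this is one crude way the "micro-credentialing" idea could operate: each review period, the flagged gaps become the worker's next short training programs.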