
Monday, July 8, 2019
"Facetime Fools" by Jen Sorensen

Labels: Africa, Asia, car accident, Europe, face time, government, humanity, latin america, life, North America, people, public, road accident, safety, smartphone, society, tech, technology, world
Monday, October 15, 2018
"Domestic Spying" by Rob Rogers
Labels: Africa, America, Asia, Big Brother, data, domestic spying, Europe, government, North America, NSA, people, privacy, public, secret, security, social media, technology, US, world
Tuesday, June 19, 2018
Are we liberated by tech or does it enslave us?
I don't think anyone would seriously argue that all technology, digital or otherwise, is inherently bad. As the saying goes, nothing in the world is bad if you use it right. Heck, as America's National Rifle Association (NRA) says, "guns don't kill people, people kill people." Undoubtedly, they are correct to an extent; after all, people were being killed long before the invention of gunpowder or guns.
Any technology, be it the phone in your hand, the tablet in your lap, the laptop on your desk, the video game boxes attached to your giant TV, the latest gadgets in your car, or even those kitchen & household machines, is not bad in itself; ultimately, its usage defines its "villainous" ability. Its "unintended consequences" ultimately depend on the user.
For instance, a common complaint from parents & healthcare professionals is that childhood diabetes is increasing all over the world because children are not spending time outdoors, but are always engrossed in their smartphones & video game boxes. True. But the solution is not to ban that tech; it is to take children to the park & spend time with them outside the house. That's the responsibility of the parents. However, parents themselves are busy with those same tech marvels, sticking their smartphones in their faces 20 hours a day. Spending so much time engrossed in their phones, they have no time left for their loved ones, for enjoying life (instead of watching others' lives & showing off their own), or for healthy habits of their own, like sleeping 6-8 hours at night.
That's the problem at the micro level. Let's take an example at the macro level: digital technology creating (or about to create) mass unemployment at a national, or even international, scale. People study & spend a significant portion of their lives in a specific profession or industry. Then they get the shock of their lives when they are laid off because their skills are no longer in demand; digital tech is replacing them. In these kinds of situations, governments & industry need to step in & jointly take control.
It's true that nobody can control the march of technology, but the damaging effects, or the "unintended consequences," can be controlled, or at least mitigated to some extent. For example, all those people who are laid off should be retrained at the expense of the government, & the companies that have disrupted the industry through their technology should also contribute financially to that retraining. Those people can then be hired by those same companies after retraining. Otherwise, they are a financial, economic, & social burden on governments & society; retrained, they become tax-paying, productive members of the citizenry who pay back the cost of their retraining to the government in the form of taxes. Governments can also look ahead & decide which professions should be promoted through the educational pipeline (schools, colleges, & universities), connecting the educational side (the supply of labour) with industry (the demand for labour), so the public has an idea of what to study now in order to earn its fruits later on.
Technology, in itself, is never bad. As the author of the opinion piece below says, its "unintended consequences" can never be predicted beforehand. But technology's bad consequences are often the result of the bad practices of its users. Users need to keep in mind how to use that technology properly, & how that technology is affecting others: their loved ones, their social circle, their professional circle, their community at large, or even themselves.
-----------------------------------------------------------------------------
Technology is unruly. New innovations bring with them a host of unintended consequences, ranging from the troubling to the downright depressing. Social media makes us lonely. Too much screen-time makes teenagers fall behind their peers. And at the more feeble end of the spectrum, many of us have walked into an obstacle while texting. Whatever glorious vision animates the moguls of Silicon Valley, it surely can’t be this.
We’re much better at designing complex systems than we are at predicting their behaviour, argues the writer Edward Tenner. Even though unintended consequences are inevitable, Tenner thinks they can be powerful catalysts for progress. But even the notion of an “intended consequence” is problematic when it comes to tech. Evgeny Morozov points out that we tend to confuse the positive consequences of information technology with intended ones, downplaying the significance of other natural, but rather less noble, upshots like pornography, surveillance and authoritarian control.
Free time is a case in point. Technology makes us more productive, but it’s also accused of unreasonably extending the domain of work. So does tech liberate us, or enslave us? And what does it really “intend” to do?
Tech and ‘free time’: a confusing picture
In 1930, the economist John Maynard Keynes predicted that the most pressing concern of the man of the future would be “how to occupy the leisure, which science and compound interest will have won for him.” It hasn’t quite turned out that way - but Keynes wasn’t entirely off the mark. When we consider the lot of the average labourer of the past, our complaints about work-life balance start to sound pretty peevish. And the rise of technology really has, it seems, given us more free time than ever. So why do we still feel harried?
It’s worth noting that modern leisure is just as tech-saturated as work. Americans who subscribe to Netflix spend more time on the site than they do eating and having sex combined, TDG research found. The average Briton spends 1 hour 20 minutes every day monitoring four social media accounts, according to research from the Global Web Index. But all this screen-time makes us uneasy. To co-opt David Foster Wallace’s description of attitudes to television in the 1990s, there’s a “weird hate-need-fear-6-hrs-daily gestalt” about the whole thing.
But technology doesn’t just offer us escape. It promises to transfigure our bodies, our minds and our very souls by making us fitter, happier, and more productive - but it does it by insinuating that we’re, well, a bit suboptimal as we are. “There’s an app for that” comes with a whispered aside: “You know you’re doing it wrong, right?”
Everyone’s a bit of a Luddite
Criticisms of tech can sound shrill, but it’s not antediluvian to notice the impossible desires technology breeds. Our devices present us with simulacra of beautiful, fit, fulfilled people pursuing their dreams and falling in love, and none of them are browsing the web at 11pm on a Saturday night - unlike us. We click and swipe our woebegone way through a vibrant world where nobody who is anybody spends their free time in front of a glowing screen, painfully aware that our only access to that world is through that very glowing screen.
But we’re no fools. We know that nothing on the web is as it seems. We long to detach ourselves from the whole circus once and for all - and so we turn once again to the internet to research digital detoxes and vent our tech-related spleen. The web has a way of dancing around us, knowingly and self-referentially and maddeningly deflecting every attempt we make to express our unease.
Is ‘free time’ a misnomer?
But prying our free time from the clutches of technology isn’t necessarily the answer. The German philosopher Theodor Adorno argued that “free time” is an artificial concept – and it’s anything but free. For Adorno, free time is the very propagation of work: it is “nothing more than a shadowy continuation of labour”.
Today’s tech-saturated leisure trade – to say nothing of the trillion-dollar behemoth that is the “wellness industry” – is an integral part of a world in which we are treated as consumers first and citizens second. Talk of reclaiming free time is missing the point. What we need is control of the time we already have.
But in yet another twist, this is just what Paul Mason thinks information technology might allow us to do. For Mason, the “sharing economy” contains within it the glimmer of a genuine alternative – a post-capitalist society structured around liberty instead of economics. If Mason is right, tech might free us from the need for “free time” entirely. But how does this complex narrative fit into the storybook of “unintended consequences”?
The myth of ‘unintended consequences’
Well, it doesn’t. Unintended consequences are a myth, because anticipating the effects of even the simplest innovation is a fool’s errand. Forget about information technology, or calculus, or Linear B: even the toaster would be a challenge.
Tech innovators frequently profess aspirations to improve the lot of mankind. Such aspirations are admirable, but we shouldn’t forget that there’s one rather more concrete intention they share: to make money. They’re vendors, we’re consumers: it’s as simple as that. Still, it’s a huge leap from there to the claim that tech is, in Foster Wallace’s words, a “diabolical corrupter of personal agency and community gumption”.
But even if tech companies aren’t really trying to enslave us, or to make us feel inadequate, that doesn’t mean that the current situation is a case of good intentions gone awry. There’s no more reason to think that tech is intrinsically good, but occasionally getting it wrong, than there is to think that it’s a remarkably successful villain.
We love to praise tech, and we love to condemn it. We equate it with chaos, power, love, hate; with democracy, with tyranny, with progress and regress - we laud it as our salvation, while lamenting it as our scourge. Like any technology that has come before it, digital technology is all of these things. But it’s essentially none of them.
Labels: Africa, Asia, children, computer, digital, Europe, family, government, health, industry, internet, latin america, modern world, North America, public, smartphone, society, tech, technology, unemployment
Monday, May 29, 2017
Israel, World Capital of Homeland Security Industries - Shir Hever on RAI
Some great insights into how the Israeli defence industry is flourishing by exporting high-tech military equipment to developing countries. The Israeli economy is becoming increasingly dependent on its defence industry. And since the defence industry is so important to the Israeli economy, the elusive peace process between Israel & Palestine will always remain elusive. As Shir Hever says at the end, why would you want peace in the Middle East when the occupation & war are bringing cold, hard cash into the country?
------------------------------------------------------------------------------------------------------------------------
PAUL JAY, SENIOR EDITOR, TRNN: ... a lot of your work has been about the Israeli security state. I guess one of the things that doesn't get talked about enough ... is how much money gets made out of the security needs of Israel or the perceived security needs of Israel. Talk about how important all that is to the Israeli economy, particularly the Israeli elite.
SHIR HEVER, ECONOMIST, ALTERNATIVE INFORMATION CENTER: There was a sort of wave shape in the amount of dependency of Israel on the security economy and the military economy. During the Cold War years, especially in the '80s, there was a peak in massive investment in security and exports, and the Israeli industrial force was over 20% working for the military industry one way or another. Then that collapsed, especially during the '90s. There's a big shift to different sectors. Many of these big companies underwent a very severe crisis.
JAY: Why?
HEVER: Because the end of the Cold War meant lower demands. There was no investment in high-tech industry. There was also a deep social change in Israel with immigrants coming from the former Soviet Union which started to integrate into different sectors of the Israeli economy, mainly the high-tech sector, which was growing very fast. And so that meant that Israel was demilitarizing in many ways. And there was also this kind of hope that because of the Oslo negotiations there would be an end of the occupation. So international corporations started to invest in the Israeli economy in the belief or in the hope that it will become less militarized and the consumer economy would flourish.
All that was actually proven to be false, and starting from the year 2000, with the Second Intifada of Palestinians against the Israeli occupation, and even more after September 11, 2001, in the United States, there was a surge of the Israeli military industry again. And it's not the majority of the Israeli economy. It's not the biggest sector in Israel. But, nevertheless, there is no country in the world in which security plays such an important role as it is in Israel.
JAY: Now, before we get into the domestic situation, let's talk about how important security is as an Israeli export, and also the politics, 'cause it's interesting who they export to.
HEVER: Yeah, this is something that--Israel always played a very interesting role in the global arms trade, because there were some attempts by the Israeli military companies to compete with the U.S. military companies, which were crushed in the bud very quickly. And that shows exactly the moment in which U.S. support for Israel stops. Whenever an Israeli military company tries to sell something to China which has some U.S. technology in it or tries to compete with a U.S. company over producing a fighter plane, for example, that's the end of U.S.-Israeli alliance right there.
So Israeli companies evolved into targeting themselves as a sort of complementary to the U.S. military industry. The U.S. is the biggest arms exporter in the world by far, but Israel is the highest per capita arms exporter in the world. And while the U.S. specializes in big combat platforms (fighter planes, bombers, warships, helicopters, and so on, tanks), Israel is selling a lot of components that go with these platforms. So if you already got the F-16 from the U.S., you can buy special missiles, navigation systems, communications systems from Israeli companies that go with your F-16.
JAY: And if you go back historically, sometimes Israel would sell and could sell things to regimes even the Americans couldn't openly sell arms to.
HEVER: Yeah. And whenever you make a sort of comparison of which are the biggest arms exporter in the world, Israel is not very high--well, it's high compared to its size, but it's never in the top five. But when you ask which countries are selling to developing economies or to those tertiary markets, then Israel is quite high on the list.
JAY: And it included the South African apartheid regime, Colombia, Honduras, and it goes on and on.
HEVER: Yeah. So if we're talking about the '60s, '70s, '80s, Israel sold weapons to Rhodesia, which also had an apartheid regime; to South Africa, breaking the embargo very blatantly; to Guatemala during the civil wars; to Chile under Pinochet. Those countries which found it very difficult to convince the more mainstream weapon companies to sell them arms could always go to Israel.
JAY: And why--I mean, Israel's always trying to present itself as a democratic country. They're very interested in making sure that American--American Jews, particularly--support, send money to Israel. They never seem concerned. I mean, most American Jews are left-of-center, progressive in one way or the other, some of which except when it comes to Israel, but certainly when it came to South Africa and Chile and things like that. But they never seem to care about that, the Israeli governments.
HEVER: Well, sometimes it's because they do that in coordination with the U.S. government. Like, for example, in the Iran-Iraq War, where weapons were being sold to both sides at the same time, Israel was selling to Iran and the U.S. was selling to Iraq, and it was coordinated. So in that case they're working with the U.S.
But also there was a sort of debate within Israel about how the arms industry is a strategic asset and who should they export to. And the question is: can the Israeli military grow stronger by this revenue that comes from exporting those kind of systems that are being developed for the Israeli army? Maybe those systems that are becoming a bit obsolete and outdated, we can sell them, use the money to develop something new.
And there was a worry in Israel after the occupation of 1967, because France, which was the biggest supplier of arms to Israel at the time, said they're going to embargo weapons sales to Israel unless Israel withdraws from the occupied territory.
JAY: What year was that?
HEVER: 1967. And the Israeli government was worried there's going to be an embargo. And at that point they said, we have to develop everything ourselves; we have to have a very strong military industry so that we can make our own tanks, our own cannons, and so on. They didn't know that they were actually going to be rewarded for the occupation and that France, which sold some weapons to Israel, is going to be replaced by a much bigger supplier in 1973, six years after the occupation, the United States. So they already made that strategic decision: they're going to reach out to new markets to try to sell Israeli weapon technology wherever they can.
But I think what we see in the last decade or so, especially after September 11, is that Israel has kind of shifted their target audience. It's not that they're looking for those countries that are under embargo to sell them the Israeli submachine guns. The famous submachine gun, the Uzi, is no longer produced in Israel, actually. It's now made in China. But actually they were going to sell to those areas in which there is extreme inequality, extreme social resentment, to the governments, in order to repress that kind of uprising.
So, actually, Israel is now the world capital of homeland security industries. They're selling security cameras, surveillance equipment, drones, riot gear. That is the sort of technology that governments need in order to control their citizens. And it comes not just with the actual technology; it comes also with an ideology. It comes with the ideology that, look what Israel is doing, how Israel is controlling Palestinians and every aspect of their lives, and decides who can pass and who gets a permit and so on, and uses this technology to leave Palestinians no option to resist, and why don't we sell that to other governments around the world. For example, Brazil bought a lot of that technology in order to repress the favelas in preparation for the World Cup. We see that in India, not just in the area of Kashmir, but mainly there along the border with Pakistan, and in Eastern Europe. And we also see that with extreme-right governments, like Berlusconi in Italy that was worried about asylum-seekers coming from Africa, and using Israeli drones and Israeli technology to try to block that, but also not just buying the technology, but also buying the legitimacy, saying Israel is a wonderful country. Berlusconi was a big pro-Israeli spokesman. And if Israel is allowed to do it, we can do it too.
JAY: And part of what the Israeli model for their sales, I guess, says to these regimes is you don't have to be worried about an uprising someday; you can repress people for decades and decades. Just look at us.
HEVER: Yeah. It's a very cynical worldview. It brings to mind 1984, that you can just use brute force to repress resistance.
I think there is a limit to how much it can work. And there's also a limit to understanding of how you can use it and where. Now that we're watching what's happening in the Ukraine, it brings to mind what happened in Georgia in 2008, in which a failed Israeli general, Gal Hirsch, who actually did terribly in the war against Lebanon of 2006, formed his own security company, went to Georgia, and talked to the government about selling them Israeli equipment. And the Georgian government believed, because of this prestige of the Israeli army, their equipment would be able to stop the Russian army. Now, we know what happened in the end in 2008. They were immediately defeated. So it actually goes to show that the Israeli army has completely lost its preparation and its technologies that were designed to fight other armies. The Israeli army hasn't engaged in a conventional war for 40 years. They're now completely concentrated on fighting civilians and repressing them.
JAY: And at the cutting edge. You mentioned drones and surveillance equipment. I saw ... Netanyahu was in California and made a deal with Jerry Brown to make deals with Silicon Valley, and cyber security is one of the things they want to work on. There's a lot of integration or interpenetration between American capital and American security, intelligence, and Silicon Valley with Israel's intellectual capacity, money, and security industry. And they have their political representatives, too.
HEVER: Yeah. But I think it's falling apart, because the Israeli high-tech industry has grown, like I said, in the '90s very rapidly. But it grew where many of these companies, their dream was to be bought by a U.S. company and then they can leave. A lot of the very talented Israeli high-tech entrepreneurs found themselves very happily moving to other countries. And then, between 2000 and 2008, there were eight years--so, after the crash of the NASDAQ of 2000 and until 2008, the next economic crisis, the average increase in value of Israeli high-tech companies was zero percent over eight years. During that time--that's because a lot of those companies collapsed and lost everything. And those who survived were mainly tied to the security apparatus, and their biggest customer is the Israeli army.
...
...
JAY: ...if this is an increasingly big sector of the Israeli economy, ... the interests of this sector certainly are not to have a peace agreement. How much does that influence politics?
HEVER: Well, it's not easy to show how that influences directly. You don't really see how that lobby works. But you can see that Israel's former prime minister and former minister of defense, Ehud Barak, he has done many political mistakes in the last couple of years, and it seemed that he is not going to be able to get into the government again. So he said, I'm now going to do what I actually like to do best: I am going to the private sector. And then it becomes apparent that he has many friends who own these security companies, and he can open doors for them, and he can get a lot of money from them. So, obviously, these security companies' business model is built on the occupation. These are companies that their motto when they go to arms trade shows and show their equipment, they say, this has already been tested by the Israeli army on actual people. You can only have that because of the occupation. So every new weapon is first sold to the Israeli army, shot at Palestinians. Then you can sell it.
JAY: Yeah, and they probably have nice little sales videos showing how this all works.
HEVER: Of course. Yeah. After this invasion of Gaza that we were talking about, there was a trade show that the Israeli army did where they showed how each and every of these new inventions were used in the attack on Gaza, completely shamelessly.
JAY: So the attacks become demos.
HEVER: The attacks become demos, and these companies make a profit out of it, and then these companies are hiring senior Israeli officials. I don't think that means that they want to end the peace process or sabotage the peace process; it means they want to continue it forever, because as long as it continues, they can continue these periodic attacks and they can continue the occupation.
JAY: Yeah, 'cause the peace process is a process of never come to an agreement about peace.
Labels: army, control, defence, Europe, Gaza, Israel, Jews, Middle East, military, North America, occupation, Palestine, peace, repression, technology, US, violence, war, weapon, world
Thursday, March 31, 2016
What's the point of college?
This whole piece from The New Yorker is a fantastic one. It's quite long, so I will keep my blog post on it rather short.
What I'd like to say here is that I've been saying what this article is saying for at least a year or so. Education nowadays doesn't carry the same respect or weight as it did about 5 decades ago. For instance, about 6 months ago I met an 18-year-old who told me that he is learning to be a salesperson right after graduating from high school, & has no intention of going to post-secondary school, because business is learned better out in the field than in a classroom. So, in essence, he will save thousands upon thousands of dollars in education costs & will ultimately come out ahead, because he will have hands-on work experience coupled with a valuable network. (I couldn't reply, since my own experiences are not so great.)
Everyone, from Canada to Chile to Egypt to South Africa to Spain to Russia to Australia to Saudi Arabia, loves to talk about how education is essential for everyone, regardless of gender, skin colour, language, religion, etc. And, in an ideal & utopian world, it definitely would be ... but we are not living in an ideal & utopian world. Education has become a business in itself. It has become so expensive to get even a 4-year bachelor's degree that families & students take out thousands of dollars in student loans to earn that degree.
But after graduation, a serious question needs to be asked: is that degree, for which I just paid thousands of dollars (or whatever other currency), worth anything? Personally, my 2 degrees, a Bachelor's in Accounting & an MBA, for which I paid about 100,000 Canadian dollars in total (excluding my time & effort), plus 2 certificates (Six Sigma & Project Management), have earned me a job as a cashier in a local superstore. So, for me, my education was definitely not worth it. I would have been better off not getting an education & spending my time working in retail, for instance. Skipping that decade in education would've saved me some cold, hard cash & I would've been a supervisor, or even a manager, of a store by now. The article does point out how highly educated graduates, even STEM graduates (science, tech, engineering, & mathematics), are working in menial jobs, perhaps due to technological advancements.
So, education is important, but we need to look at its worth. The article cites examples of colleges & universities, & a lot of statistics & studies, in trying to answer the primary question of the worth of education. Eventually, it asks what the purpose of colleges & universities is, since their graduates, who are constantly increasing in number, are not really earning that much, & their wages are actually falling. This seems counterintuitive, since education is supposed to increase your worth in the labour market, not decrease it.
A theory the article puts forward is that degrees & certificates perhaps signal the competency of the candidate to potential employers. Potential employers are thus merely using degrees as a filter in the hiring process. Perhaps that is why the "branding" of a university comes into play: since everyone around you is getting a degree, if you want to stand out from the crowd, you need to "buy" a degree from a so-called "prestigious" school. It then becomes an arms race, & it only helps educational institutes & financial services companies (i.e. banks) make huge profits.
Unfortunately, the article stops short of giving a definitive reason why education is not paying off for many people nowadays. One reason it gives, with which I do agree, is that the blind corporate race of cost-cutting means more tech solutions are being implemented & fewer graduates are being hired & trained for future management positions.
One other reason I would give for education not paying off in modern times is the absence of meritocracy. We all like to think that education improves the personal financial bottom line, but we forget that this happens only when the labour market is merit-based, which it is not. We have all heard of that accursed word, "networking," which merely implies that your education & qualifications matter less if you know the right people in the right places. Nowadays, employers get so many resumes / CVs / bio-datas, all essentially similar, that instead of sifting through them to find the perfect candidate, they simply ask the people around them to recommend one.
This disease of "networking" is very common in developing countries because, due to their large populations & their push for more education, everyone has a similar educational base. But networking is relatively new in Western countries, because up until a few years ago your education still played a major part in what job & earning potential you ended up with in life. It was somewhat of a meritocracy. Not anymore. Networking disrupts the level playing field, since your education & qualifications are useless in the face of "how many influential people are your or your family's friends."
Anyway, my blog post has become a long one, & I would prefer that you read the article, which is quite informative & thought-provoking. We all like to think that education is essential to succeed in life, but we forget that the world is an unfair place, & in the near future your education will matter less & your connections will matter more.
---------------------------------------------------------------------------------
If there is one thing most Americans have been able to agree on over the years, it is that getting an education, particularly a college education, is a key to human betterment & prosperity. ... Already, the cost of higher education has become a big issue in the 2016 Presidential campaign. ...
Promoters of higher education have long emphasized its role in meeting civic needs. The Puritans who established Harvard were concerned about a shortage of clergy; during the Progressive Era, John Dewey insisted that a proper education would make people better citizens, with enlarged moral imaginations. Recently, as wage stagnation & rising inequality have emerged as serious problems, the economic arguments for higher education have come to the fore. “Earning a post-secondary degree or credential is no longer just a pathway to opportunity for a talented few,” the White House Web site states. “Rather, it is a prerequisite for the growing jobs of the new economy.” Commentators & academic economists have claimed that college doesn’t merely help individuals get higher-paying jobs; it raises wages throughout the economy & helps ameliorate rising inequality. In an influential 2008 book, “The Race Between Education and Technology,” the Harvard economists Claudia Goldin & Lawrence F. Katz argued that technological progress has dramatically increased the demand for skilled workers, & that, in recent decades, the American educational system has failed to meet the challenge by supplying enough graduates who can carry out the tasks that a high-tech economy requires. “Not so long ago, the American economy grew rapidly and wages grew in tandem, with education playing a large, positive role in both,” they wrote in a subsequent paper. “The challenge now is to revitalize education-based mobility.”
The “message from the media, from the business community, and even from many parts of the government has been that a college degree is more important than ever in order to have a good career,” Peter Cappelli, a professor of management at Wharton, notes in his informative & refreshingly skeptical new book, “Will College Pay Off?” (PublicAffairs). “As a result, families feel even more pressure to send their kids to college. This is at a time when more families find those costs to be a serious burden.” During recent decades, tuition & other charges have risen sharply—many colleges charge more than 50,000 dollars a year in tuition & fees. Even if you factor in the expansion of financial aid, Cappelli reports, “students in the United States pay about four times more than their peers in countries elsewhere.”
Despite the increasing costs — & the claims about a shortage of college graduates — the number of people attending & graduating from four-year educational institutions keeps going up. In the 2000-01 academic year, American colleges awarded almost 1.3 million bachelor’s degrees. A decade later, the figure had jumped nearly 40%, to more than 1.7 million. About 70% of all high-school graduates now go on to college, & half of all Americans between the ages of 25 & 34 have a college degree. That’s a big change. In 1980, only 1 in 6 Americans 25 & older were college graduates. 50 years ago, it was fewer than 1 in 10. To cater to all the new students, colleges keep expanding & adding courses, many of them vocationally inclined. At Kansas State, undergraduates can major in Bakery Science & Management or Wildlife & Outdoor Enterprise Management. They can minor in Unmanned Aircraft Systems or Pet Food Science. Oklahoma State offers a degree in Fire Protection & Safety Engineering & Technology. At Utica College, you can major in Economic Crime Detection.
In the fast-growing for-profit college sector, which now accounts for more than 10% of all students, vocational degrees are the norm. DeVry University — which last year taught more than 60,000 students, at more than 75 campuses — offers majors in everything from multimedia design & development to health-care administration. On its Web site, DeVry boasts, “In 2013, 90% of DeVry University associate and bachelor’s degree grads actively seeking employment had careers in their field within six months of graduation.” That sounds impressive — until you notice that the figure includes those graduates who had jobs in their field before graduation. (Many DeVry students are working adults who attend college part-time to further their careers.) Nor is the phrase “in their field” clearly defined. “Would you be okay rolling the dice on a degree in communications based on information like that?” Cappelli writes. He notes that research by the nonprofit National Association of Colleges & Employers found that, in the same year, just 6.5% of graduates with communications degrees were offered jobs in the field. It may be unfair to single out DeVry, which is one of the more reputable for-profit education providers. But the example illustrates Cappelli’s larger point: many of the claims that are made about higher education don’t stand up to scrutiny.
“It is certainly true that college has been life changing for most people and a tremendous financial investment for many of them,” Cappelli writes. “It is also true that for some people, it has been financially crippling. . . .The world of college education is different now than it was a generation ago, when many of the people driving policy decisions on education went to college, and the theoretical ideas about why college should pay off do not comport well with the reality.”
No idea has had more influence on education policy than the notion that colleges teach their students specific, marketable skills, which they can use to get a good job. Economists refer to this as the “human capital” theory of education, & for the past 20 or 30 years it has gone largely unchallenged. If you’ve completed a two-year associate’s degree, you’ve got more “human capital” than a high-school graduate. And if you’ve completed a four-year bachelor’s degree you’ve got more “human capital” than someone who attended a community college. Once you enter the labor market, the theory says, you will be rewarded with a better job, brighter career prospects, & higher wages.
There’s no doubt that college graduates earn more money, on average, than people who don’t have a degree. And for many years the so-called “college wage premium” grew. In 1970, according to a recent study by researchers at the Federal Reserve Bank of New York, people with a bachelor’s degree earned about 60,000 dollars a year, on average, & people with a high-school diploma earned about 45,000 dollars. 35 years later, in 2005, the average earnings of college graduates had risen to more than 70,000 dollars, while high-school graduates had seen their earnings fall slightly. (All these figures are inflation-adjusted.) The fact that the college wage premium went up at a time when the supply of graduates was expanding significantly seemed to confirm the Goldin-Katz theory that technological change was creating an ever-increasing demand for workers with a lot of human capital.
During the past decade or so, however, a number of things have happened that don’t easily mesh with that theory. If college graduates remain in short supply, their wages should still be rising. But they aren’t. In 2001, according to the Economic Policy Institute, a liberal think tank in Washington, workers with undergraduate degrees (but not graduate degrees) earned, on average, $30.05 an hour; last year, they earned $29.55 an hour. Other sources show even more dramatic falls. “Between 2001 and 2013, the average wage of workers with a bachelor’s degree declined 10.3 percent, and the average wage of those with an associate’s degree declined 11.1 percent,” the New York Fed reported in its study. Wages have been falling most steeply of all among newly minted college graduates. And jobless rates have been rising. In 2007, 5.5% of college graduates under the age of 25 were out of work. Today, the figure is close to 9%. If getting a bachelor’s degree is meant to guarantee entry to an arena in which jobs are plentiful & wages rise steadily, the education system has been failing for some time.
And, while college graduates are still doing a lot better than nongraduates, some studies show that the earnings gap has stopped growing. The figures need careful parsing. If you lump college graduates in with people with advanced degrees, the picture looks brighter. But almost all the recent gains have gone to folks with graduate degrees. “The four-year-degree premium has remained flat over the past decade,” the Federal Reserve Bank of Cleveland reported. And one of the main reasons it went up in the first place wasn’t that college graduates were enjoying significantly higher wages. It was that the earnings of nongraduates were falling.
Many students & their families extend themselves to pay for a college education out of fear of falling into the low-wage economy. That’s perfectly understandable. But how sound an investment is it? One way to figure this out is to treat a college degree like a stock or a bond & compare the cost of obtaining one with the accumulated returns that it generates over the years. (In this case, the returns come in the form of wages over & above those earned by people who don’t hold degrees.) When the research firm PayScale did this a few years ago, it found that the average inflation-adjusted return on a college education is about 7%, which is a bit lower than the historical rate of return on the stock market. Cappelli cites this study along with one from the Hamilton Project, a Washington-based research group that came up with a much higher figure — about 15% — but by assuming, for example, that all college students graduate in 4 years. (In fact, the four-year graduation rate for full-time, first-degree students is less than 40%, & the six-year graduation rate is less than 60%.)
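To make the "degree as an investment" framing above concrete, here is a minimal Python sketch of the kind of calculation PayScale & the Hamilton Project are described as doing: treat tuition as a cash outflow, treat the later wage premium over non-graduates as a cash inflow, & solve for the implied rate of return. All the numbers below are hypothetical placeholders, not figures from the article.

```python
# A toy sketch of the "degree as an investment" calculation, with made-up numbers.

def npv(rate, cashflows):
    """Net present value of yearly cash flows, where cashflows[t] occurs in year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def internal_rate_of_return(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Find the discount rate that makes the NPV zero, by bisection.

    Assumes the usual pattern of negative early flows and positive later ones,
    so NPV falls as the rate rises.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid          # NPV still positive: the break-even rate is higher
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical inputs: 4 years paying $25,000 in tuition, then 40 working years
# earning a $10,000 annual wage premium over someone without the degree.
cashflows = [-25_000] * 4 + [10_000] * 40
print(f"Implied annual return on the degree: {internal_rate_of_return(cashflows):.1%}")
```

With these toy inputs the return lands in the high single digits, roughly the neighbourhood of the article's 7% average; the real lesson, as Cappelli stresses, is how sharply the answer moves when you change the tuition, the wage premium, or the number of years it takes to graduate.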
These types of studies, & there are lots of them, usually find that the financial benefits of getting a college degree are much larger than the financial costs. But Cappelli points out that for parents & students the average figures may not mean much, because they disguise enormous differences in outcomes from school to school. He cites a survey, carried out by PayScale for Businessweek in 2012, that showed that students who attend M.I.T., Caltech, & Harvey Mudd College enjoy an annual return of more than 10% on their “investment.” But the survey also found almost 200 colleges where students, on average, never fully recouped the costs of their education. “The big news about the payoff from college should be the incredible variation in it across colleges,” Cappelli writes. “Looking at the actual return on the costs of attending college, careful analyses suggest that the payoff from many college programs—as much as 1 in 4—is actually negative. Incredibly, the schools seem to add nothing to the market value of the students.”
So what purpose does college really serve for students & employers? Before the human-capital theory became so popular, there was another view of higher education—as, in part, a filter, or screening device, that sorted individuals according to their aptitudes & conveyed this information to businesses & other hiring institutions. By completing a four-year degree, students could signal to potential employers that they had a certain level of cognitive competence & could carry out assigned tasks & work in a group setting. But a college education didn’t necessarily imbue students with specific work skills that employers needed, or make them more productive.
Kenneth Arrow, one of the giants of twentieth-century economics, came up with this account, & if you take it seriously you can’t assume that it’s always a good thing to persuade more people to go to college. If almost everybody has a college degree, getting one doesn’t differentiate you from the pack. To get the job you want, you might have to go to a fancy (& expensive) college, or get a higher degree. Education turns into an arms race, which primarily benefits the arms manufacturers—in this case, colleges & universities.
The screening model isn’t very fashionable these days, partly because it seems perverse to suggest that education doesn’t boost productivity. But there’s quite a bit of evidence that seems to support Arrow’s theory. In recent years, more jobs have come to demand a college degree as an entry requirement, even though the demands of the jobs haven’t changed much. Some nursing positions are on the list, along with jobs for executive secretaries, salespeople, & distribution managers. According to one study, just 20% of executive assistants & insurance-claims clerks have college degrees but more than 45% of the job openings in the field require one. “This suggests that employers may be relying on a B.A. as a broad recruitment filter that may or may not correspond to specific capabilities needed to do the job,” the study concluded.
It is well established that students who go to elite colleges tend to earn more than graduates of less selective institutions. But is this because Harvard & Princeton do a better job of teaching valuable skills than other places, or because employers believe that they get more talented students to begin with? An exercise carried out by Lauren Rivera, of the Kellogg School of Management, at Northwestern, strongly suggests that it’s the latter. Rivera interviewed more than a hundred recruiters from investment banks, law firms, & management consulting firms, & she found that they recruited almost exclusively from the very top-ranked schools, & simply ignored most other applicants. The recruiters didn’t pay much attention to things like grades & majors. “It was not the content of education that elite employers valued but rather its prestige,” Rivera concluded.
If higher education serves primarily as a sorting mechanism, that might help explain another disturbing development: the tendency of many college graduates to take jobs that don’t require college degrees. Practically everyone seems to know a well-educated young person who is working in a bar or a mundane clerical job, because he or she can’t find anything better. Doubtless, the Great Recession & its aftermath are partly to blame. But something deeper, & more lasting, also seems to be happening.
In the Goldin-Katz view of things, technological progress generates an ever-increasing need for highly educated, highly skilled workers. But, beginning in about 2000, for reasons that are still not fully understood, the pace of job creation in high-paying, highly skilled fields slowed significantly. To demonstrate this, 3 Canadian economists, Paul Beaudry, David A. Green, & Benjamin M. Sand, divided the US workforce into a hundred occupations, ranked by their average wages, & looked at how employment has changed in each category. Since 2000, the economists showed, the demand for highly educated workers declined, while job growth in low-paying occupations increased strongly. “High-skilled workers have moved down the occupational ladder and have begun to perform jobs traditionally performed by lower-skilled workers,” they concluded, thus “pushing low-skilled workers even further down the occupational ladder.”
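As an illustration only (this is not the economists' actual data or code), the exercise Beaudry, Green, & Sand are described as performing boils down to something like the following: rank occupations by their average wage, then compare employment growth at the top & the bottom of that ranking. The tiny dataset & column names below are invented for the sketch.

```python
# Illustrative sketch only -- not the economists' actual data or code.
# The occupations, wages, and employment counts below are invented.
import pandas as pd

occupations = pd.DataFrame({
    "occupation":      ["engineer", "manager", "nurse", "clerk", "barista", "cleaner"],
    "avg_wage":        [95_000, 85_000, 70_000, 38_000, 27_000, 24_000],
    "employment_2000": [900, 1200, 1500, 2000, 800, 1100],   # thousands of workers
    "employment_2013": [950, 1250, 1650, 1900, 1150, 1400],
})

# Rank occupations from highest to lowest average wage.
occupations["wage_rank"] = occupations["avg_wage"].rank(ascending=False)

# Employment growth over the period for each occupation.
occupations["growth"] = occupations["employment_2013"] / occupations["employment_2000"] - 1

# Compare average growth in the high-wage half vs. the low-wage half.
high_wage = occupations.loc[occupations["wage_rank"] <= 3, "growth"].mean()
low_wage  = occupations.loc[occupations["wage_rank"] > 3, "growth"].mean()
print(f"High-wage occupations grew {high_wage:.0%} on average; low-wage grew {low_wage:.0%}.")
```

Run on real occupation-level data for 2000 onward, the pattern the article describes would show up as the low-wage half outgrowing the high-wage half.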
Increasingly, the competition for jobs is taking place in areas of the labor market where college graduates didn’t previously tend to compete. As Beaudry, Green, & Sand put it, “having a B.A. is less about obtaining access to high paying managerial and technology jobs and more about beating out less educated workers for the Barista or clerical job.” Even many graduates in science, technology, engineering, & mathematics—the so-called STEM subjects, which receive so much official encouragement—are having a tough time getting the jobs they’d like. Cappelli reports that only about a fifth of recent graduates with STEM degrees got jobs that made use of that training. “The evidence for recent grads suggests clearly that there is no overall shortage of STEM grads,” he writes.
Why is this happening? The short answer is that nobody knows for sure. One theory is that corporate cost-cutting, having thinned the ranks of workers on the factory floor & in routine office jobs, is now targeting supervisors, managers, & other highly educated people. Another theory is that technological progress, after favoring highly educated workers for a long time, is now turning on them. With rapid advances in processing power, data analysis, voice recognition, & other forms of artificial intelligence, computers can perform tasks that were previously carried out by college graduates, such as analyzing trends, translating foreign-language documents, & filing tax returns. In “The Second Machine Age” (Norton), the M.I.T. professors Erik Brynjolfsson & Andrew McAfee sketch a future where computers will start replacing doctors, lawyers, & many other highly educated professionals. “As digital labor becomes more pervasive, capable, and powerful,” they write, “companies will be increasingly unwilling to pay people wages that they’ll accept, and that will allow them to maintain the standard of living to which they’ve been accustomed.”
Cappelli stresses the change in corporate hiring patterns. In the old days, Fortune 500 companies such as General Motors, Citigroup, & I.B.M. took on large numbers of college graduates & trained them for a lifetime at the company. But corporations now invest less in education & training, &, instead of promoting someone, or finding someone in the company to fill a specialized role, they tend to hire from outside. Grooming the next generation of leadership is much less of a concern. “What employers want from college graduates now is the same thing they want from applicants who have been out of school for years, and that is job skills and the ability to contribute now,” Cappelli writes. “That change is fundamental, and it is the reason that getting a good job out of college is now such a challenge.”
Obtaining a vocational degree or certificate is one strategy that many students employ to make themselves attractive to employers, &, on the face of it, this seems sensible. If you’d like to be a radiology technician, shouldn’t you get a B.A. in radiology? If you want to run a bakery, why not apply to Kansas State & sign up for that major in Bakery Science? But narrowly focussed degrees are risky. “If you graduate in a year when gambling is up and the casinos like your casino management degree, you probably have hit it big,” Cappelli writes. “If they aren’t hiring when you graduate, you may be even worse off getting a first job with that degree anywhere else precisely because it was so tuned to that group of employers.” During the dot-com era, enrollment in computer-science & information-technology programs rose sharply. After the bursting of the stock-market bubble, many of these graduates couldn’t find work. “Employers who say that we need more engineers or IT grads are not promising to hire them when they graduate in four years,” Cappelli notes. “Pushing kids into a field like health care because someone believes there is a need there now will not guarantee that they all get jobs &, if they do, that those jobs will be as good as workers in that field have now.”
So what’s the solution? Some people believe that online learning will provide a viable low-cost alternative to a live-in college education. ... Another approach is to direct more students & resources to two-year community colleges & other educational institutions that cost less than four-year colleges. President Obama recently called for all qualified high-school students to be guaranteed a place in community college, & for tuition fees to be eliminated. Such policies would reverse recent history. In a new book, “Learning by Doing: The Real Connection between Innovation, Wages, and Wealth” (Yale), James Bessen, a technology entrepreneur who also teaches at Boston University School of Law, points out that “the policy trend over the last decade has been to starve community colleges in order to feed four-year colleges, especially private research universities.” Some of the discrepancies are glaring. Richard Vedder, who teaches economics at Ohio University, calculated that in 2010 Princeton, which had an endowment of close to fifteen billion dollars, received state & federal benefits equivalent to roughly 50,000 dollars per student, whereas the nearby College of New Jersey got benefits of just 2,000 dollars per student. There are sound reasons for rewarding excellence & sponsoring institutions that do important scientific research. But is a twenty-five-to-one difference in government support really justified?
Perhaps the strongest argument for caring about higher education is that it can increase social mobility, regardless of whether the human-capital theory or the signalling theory is correct. A recent study by researchers at the Federal Reserve Bank of San Francisco showed that children who are born into households in the poorest fifth of the income distribution are 6 times as likely to reach the top fifth if they graduate from college. Providing access to college for more kids from deprived backgrounds helps nurture talents that might otherwise go to waste, & it’s the right thing to do. (Of course, if college attendance were practically universal, having a degree would send a weaker signal to employers.) But increasing the number of graduates seems unlikely to reverse the over-all decline of high-paying jobs, & it won’t resolve the income-inequality problem, either. As the economist Lawrence Summers & two colleagues showed in a recent simulation, even if we magically summoned up college degrees for a tenth of all the working-age American men who don’t have them—by historical standards, a big boost in college-graduation rates—we’d scarcely change the existing concentration of income at the very top of the earnings distribution, where C.E.O.s & hedge-fund managers live.
Being more realistic about the role that college degrees play would help families & politicians make better choices. It could also help us appreciate the actual merits of a traditional broad-based education, often called a liberal-arts education, rather than trying to reduce everything to an economic cost-benefit analysis. “To be clear, the idea is not that there will be a big financial payoff to a liberal arts degree,” Cappelli writes. “It is that there is no guarantee of a payoff from very practical, work-based degrees either, yet that is all those degrees promise. For liberal arts, the claim is different and seems more accurate, that it will enrich your life and provide lessons that extend beyond any individual job. There are centuries of experience providing support for that notion.”
What I like to say here is that I've been saying what this article is saying for, at least, a year or so. Education nowadays doesn't have the same respect or weight as it used to carry about 5 decades ago. For instance, I met an 18-year-old about 6 months ago who said to me that he is learning to be a salesperson, right after graduating from high school, & have no intention to go to post-secondary school, because business education is learned best out in the field than in a class. So, in essence, he will save thousands upon thousands of dollars in education & will ultimately come out ahead because he will have hands-on work experience coupled with a valuable network. (I couldn't reply since my own experiences are not so great).
Everyone, from Canada to Chile to Egypt to South Africa to Spain to Russia to Australia to Saudi Arabia, loves to talk about how education is essential for everyone, regardless to gender, skin colour, language, religion etc. And, in an ideal & utopian world, it definitely will be ... but, we are not living in an ideal & utopian world. Education has become a business in itself. It has become so expensive to even get a 4-year bachelor's degree that families & students take out thousands of student loans to earn that degree.
But after graduation a serious question needs to be asked is that degree, for which I just paid thousands of dollars (or whatever other currency), worth anything? Personally, my 2 degrees, a Bachelor's in Accounting & an MBA, for which I paid about 100,000 Canadian dollars (excluding my time & effort) in total, & 2 certificates (Six Sigma & Project Management) have earned me a job as a cashier in a local superstore. So, for me, my education was definitely not worth it. I would have been better off by not getting an education & spent my time working in retail, for instance. That decade in education would've saved me some hard cold cash & I would've been a supervisor, or even a manager, of a store by now. The article does point out this fact that how highly educated graduates, even STEM graduates (science, tech, engineering, & mathematics), are working in measly jobs, perhaps, due to technological advancements.
So, education is important but we need to look at its worth. The article states examples of colleges & universities, & a lot of statistics & studies, in trying to answer the primary question of the worth of education. Eventually, it asks what is the purpose of colleges & universities, since their graduates, which are constantly increasing in number, are not really earning that much, & their wages are actually falling. This seems counterintuitive since education is supposed to increase your worth, & not decrease it in the labour market.
A theory the article puts forward is that degrees & certificates perhaps try to signal to potential employers the competency of the candidate. Potential employers are thus merely using the degrees as a filter in the hiring process. Perhaps, that is why, "branding" of a university comes into play. Since, everyone around you is getting a degree, if you want to stand out from the crowd, you need to "buy" a degree from a so-called "prestigious" school. Then, it becomes an arms race, & it is only helping educational institutes & financial services companies (i.e. banks) in making huge profits.
Unfortunately, the article stops short of giving a definite reason why this is happening that education is not paying off for many, nowadays. One reason it gives, with which I do agree, is that this blind corporate race of cost-cutting is increasing more tech solutions implementation with fewer graduates being hired to train for future management positions.
One other reason I would give for education not paying off in modern times is the absence of meritocracy. We all like to think that education improves our personal financial bottom line, but we forget that this only happens when the labour market is merit-based, which it is not. We have all heard that accursed word, "networking," which merely means that your education & qualifications matter less than knowing the right people in the right places. Nowadays, employers get so many resumes / CVs / bio-datas, all essentially similar, that instead of sifting through them to find the perfect candidate, they simply ask the people around them to recommend one.
This disease of "networking" is very common in developing countries because, due to their large populations & their push for more education, everyone has a similar educational base. But networking is relatively new in Western countries, because up until a few years ago your education still played a major part in what job & earning potential you ended up with. It was somewhat of a meritocracy. Not anymore. Networking disrupts a level playing field, since your education & qualifications are useless in the face of "how many influential people are friends of you or your family".
Anyway, this blog post has become a long one, & I would prefer that you read the article itself, which is quite informative & thought-provoking. We all like to think that education is essential to succeed in life, but we forget that the world is an unfair place, & in the near future your education will matter less & your connections will matter more.
---------------------------------------------------------------------------------
If there is one thing most Americans have been able to agree on over the years, it is that getting an education, particularly a college education, is a key to human betterment & prosperity. ... Already, the cost of higher education has become a big issue in the 2016 Presidential campaign. ...
Promoters of higher education have long emphasized its role in meeting civic needs. The Puritans who established Harvard were concerned about a shortage of clergy; during the Progressive Era, John Dewey insisted that a proper education would make people better citizens, with enlarged moral imaginations. Recently, as wage stagnation & rising inequality have emerged as serious problems, the economic arguments for higher education have come to the fore. “Earning a post-secondary degree or credential is no longer just a pathway to opportunity for a talented few,” the White House Web site states. “Rather, it is a prerequisite for the growing jobs of the new economy.” Commentators & academic economists have claimed that college doesn’t merely help individuals get higher-paying jobs; it raises wages throughout the economy & helps ameliorate rising inequality. In an influential 2008 book, “The Race Between Education and Technology,” the Harvard economists Claudia Goldin & Lawrence F. Katz argued that technological progress has dramatically increased the demand for skilled workers, & that, in recent decades, the American educational system has failed to meet the challenge by supplying enough graduates who can carry out the tasks that a high-tech economy requires. “Not so long ago, the American economy grew rapidly and wages grew in tandem, with education playing a large, positive role in both,” they wrote in a subsequent paper. “The challenge now is to revitalize education-based mobility.”
The “message from the media, from the business community, and even from many parts of the government has been that a college degree is more important than ever in order to have a good career,” Peter Cappelli, a professor of management at Wharton, notes in his informative & refreshingly skeptical new book, “Will College Pay Off?” (PublicAffairs). “As a result, families feel even more pressure to send their kids to college. This is at a time when more families find those costs to be a serious burden.” During recent decades, tuition & other charges have risen sharply—many colleges charge more than 50,000 dollars a year in tuition & fees. Even if you factor in the expansion of financial aid, Cappelli reports, “students in the United States pay about four times more than their peers in countries elsewhere.”
Despite the increasing costs — & the claims about a shortage of college graduates — the number of people attending & graduating from four-year educational institutions keeps going up. In the 2000-01 academic year, American colleges awarded almost 1.3 million bachelor’s degrees. A decade later, the figure had jumped nearly 40%, to more than 1.7 million. About 70% of all high-school graduates now go on to college, & half of all Americans between the ages of 25 & 34 have a college degree. That’s a big change. In 1980, only 1 in 6 Americans 25 & older were college graduates. 50 years ago, it was fewer than 1 in 10. To cater to all the new students, colleges keep expanding & adding courses, many of them vocationally inclined. At Kansas State, undergraduates can major in Bakery Science & Management or Wildlife & Outdoor Enterprise Management. They can minor in Unmanned Aircraft Systems or Pet Food Science. Oklahoma State offers a degree in Fire Protection & Safety Engineering & Technology. At Utica College, you can major in Economic Crime Detection.
In the fast-growing for-profit college sector, which now accounts for more than 10% of all students, vocational degrees are the norm. DeVry University — which last year taught more than 60,000 students, at more than 75 campuses — offers majors in everything from multimedia design & development to health-care administration. On its Web site, DeVry boasts, “In 2013, 90% of DeVry University associate and bachelor’s degree grads actively seeking employment had careers in their field within six months of graduation.” That sounds impressive — until you notice that the figure includes those graduates who had jobs in their field before graduation. (Many DeVry students are working adults who attend college part-time to further their careers.) Nor is the phrase “in their field” clearly defined. “Would you be okay rolling the dice on a degree in communications based on information like that?” Cappelli writes. He notes that research by the nonprofit National Association of Colleges & Employers found that, in the same year, just 6.5% of graduates with communications degrees were offered jobs in the field. It may be unfair to single out DeVry, which is one of the more reputable for-profit education providers. But the example illustrates Cappelli’s larger point: many of the claims that are made about higher education don’t stand up to scrutiny.
“It is certainly true that college has been life changing for most people and a tremendous financial investment for many of them,” Cappelli writes. “It is also true that for some people, it has been financially crippling. . . .The world of college education is different now than it was a generation ago, when many of the people driving policy decisions on education went to college, and the theoretical ideas about why college should pay off do not comport well with the reality.”
No idea has had more influence on education policy than the notion that colleges teach their students specific, marketable skills, which they can use to get a good job. Economists refer to this as the “human capital” theory of education, & for the past 20 or 30 years it has gone largely unchallenged. If you’ve completed a two-year associate’s degree, you’ve got more “human capital” than a high-school graduate. And if you’ve completed a four-year bachelor’s degree you’ve got more “human capital” than someone who attended a community college. Once you enter the labor market, the theory says, you will be rewarded with a better job, brighter career prospects, & higher wages.
There’s no doubt that college graduates earn more money, on average, than people who don’t have a degree. And for many years the so-called “college wage premium” grew. In 1970, according to a recent study by researchers at the Federal Reserve Bank of New York, people with a bachelor’s degree earned about 60,000 dollars a year, on average, & people with a high-school diploma earned about 45,000 dollars. 35 years later, in 2005, the average earnings of college graduates had risen to more than 70,000 dollars, while high-school graduates had seen their earnings fall slightly. (All these figures are inflation-adjusted.) The fact that the college wage premium went up at a time when the supply of graduates was expanding significantly seemed to confirm the Goldin-Katz theory that technological change was creating an ever-increasing demand for workers with a lot of human capital.
During the past decade or so, however, a number of things have happened that don’t easily mesh with that theory. If college graduates remain in short supply, their wages should still be rising. But they aren’t. In 2001, according to the Economic Policy Institute, a liberal think tank in Washington, workers with undergraduate degrees (but not graduate degrees) earned, on average, $30.05 an hour; last year, they earned $29.55 an hour. Other sources show even more dramatic falls. “Between 2001 and 2013, the average wage of workers with a bachelor’s degree declined 10.3 percent, and the average wage of those with an associate’s degree declined 11.1 percent,” the New York Fed reported in its study. Wages have been falling most steeply of all among newly minted college graduates. And jobless rates have been rising. In 2007, 5.5% of college graduates under the age of 25 were out of work. Today, the figure is close to 9%. If getting a bachelor’s degree is meant to guarantee entry to an arena in which jobs are plentiful & wages rise steadily, the education system has been failing for some time.
And, while college graduates are still doing a lot better than nongraduates, some studies show that the earnings gap has stopped growing. The figures need careful parsing. If you lump college graduates in with people with advanced degrees, the picture looks brighter. But almost all the recent gains have gone to folks with graduate degrees. “The four-year-degree premium has remained flat over the past decade,” the Federal Reserve Bank of Cleveland reported. And one of the main reasons it went up in the first place wasn’t that college graduates were enjoying significantly higher wages. It was that the earnings of nongraduates were falling.
Many students & their families extend themselves to pay for a college education out of fear of falling into the low-wage economy. That’s perfectly understandable. But how sound an investment is it? One way to figure this out is to treat a college degree like a stock or a bond & compare the cost of obtaining one with the accumulated returns that it generates over the years. (In this case, the returns come in the form of wages over & above those earned by people who don’t hold degrees.) When the research firm PayScale did this a few years ago, it found that the average inflation-adjusted return on a college education is about 7%, which is a bit lower than the historical rate of return on the stock market. Cappelli cites this study along with one from the Hamilton Project, a Washington-based research group that came up with a much higher figure — about 15% — but by assuming, for example, that all college students graduate in 4 years. (In fact, the four-year graduation rate for full-time, first-degree students is less than 40%, & the six-year graduation rate is less than 60%.)
These types of studies, & there are lots of them, usually find that the financial benefits of getting a college degree are much larger than the financial costs. But Cappelli points out that for parents & students the average figures may not mean much, because they disguise enormous differences in outcomes from school to school. He cites a survey, carried out by PayScale for Businessweek in 2012, that showed that students who attend M.I.T., Caltech, & Harvey Mudd College enjoy an annual return of more than 10% on their “investment.” But the survey also found almost 200 colleges where students, on average, never fully recouped the costs of their education. “The big news about the payoff from college should be the incredible variation in it across colleges,” Cappelli writes. “Looking at the actual return on the costs of attending college, careful analyses suggest that the payoff from many college programs—as much as 1 in 4—is actually negative. Incredibly, the schools seem to add nothing to the market value of the students.”
So what purpose does college really serve for students & employers? Before the human-capital theory became so popular, there was another view of higher education—as, in part, a filter, or screening device, that sorted individuals according to their aptitudes & conveyed this information to businesses & other hiring institutions. By completing a four-year degree, students could signal to potential employers that they had a certain level of cognitive competence & could carry out assigned tasks & work in a group setting. But a college education didn’t necessarily imbue students with specific work skills that employers needed, or make them more productive.
Kenneth Arrow, one of the giants of twentieth-century economics, came up with this account, & if you take it seriously you can’t assume that it’s always a good thing to persuade more people to go to college. If almost everybody has a college degree, getting one doesn’t differentiate you from the pack. To get the job you want, you might have to go to a fancy (& expensive) college, or get a higher degree. Education turns into an arms race, which primarily benefits the arms manufacturers—in this case, colleges & universities.
The screening model isn’t very fashionable these days, partly because it seems perverse to suggest that education doesn’t boost productivity. But there’s quite a bit of evidence that seems to support Arrow’s theory. In recent years, more jobs have come to demand a college degree as an entry requirement, even though the demands of the jobs haven’t changed much. Some nursing positions are on the list, along with jobs for executive secretaries, salespeople, & distribution managers. According to one study, just 20% of executive assistants & insurance-claims clerks have college degrees but more than 45% of the job openings in the field require one. “This suggests that employers may be relying on a B.A. as a broad recruitment filter that may or may not correspond to specific capabilities needed to do the job,” the study concluded.
It is well established that students who go to elite colleges tend to earn more than graduates of less selective institutions. But is this because Harvard & Princeton do a better job of teaching valuable skills than other places, or because employers believe that they get more talented students to begin with? An exercise carried out by Lauren Rivera, of the Kellogg School of Management, at Northwestern, strongly suggests that it’s the latter. Rivera interviewed more than a hundred recruiters from investment banks, law firms, & management consulting firms, & she found that they recruited almost exclusively from the very top-ranked schools, & simply ignored most other applicants. The recruiters didn’t pay much attention to things like grades & majors. “It was not the content of education that elite employers valued but rather its prestige,” Rivera concluded.
If higher education serves primarily as a sorting mechanism, that might help explain another disturbing development: the tendency of many college graduates to take jobs that don’t require college degrees. Practically everyone seems to know a well-educated young person who is working in a bar or a mundane clerical job, because he or she can’t find anything better. Doubtless, the Great Recession & its aftermath are partly to blame. But something deeper, & more lasting, also seems to be happening.
In the Goldin-Katz view of things, technological progress generates an ever-increasing need for highly educated, highly skilled workers. But, beginning in about 2000, for reasons that are still not fully understood, the pace of job creation in high-paying, highly skilled fields slowed significantly. To demonstrate this, 3 Canadian economists, Paul Beaudry, David A. Green, & Benjamin M. Sand, divided the US workforce into a hundred occupations, ranked by their average wages, & looked at how employment has changed in each category. Since 2000, the economists showed, the demand for highly educated workers declined, while job growth in low-paying occupations increased strongly. “High-skilled workers have moved down the occupational ladder and have begun to perform jobs traditionally performed by lower-skilled workers,” they concluded, thus “pushing low-skilled workers even further down the occupational ladder.”
Increasingly, the competition for jobs is taking place in areas of the labor market where college graduates didn’t previously tend to compete. As Beaudry, Green, & Sand put it, “having a B.A. is less about obtaining access to high paying managerial and technology jobs and more about beating out less educated workers for the Barista or clerical job.” Even many graduates in science, technology, engineering, & mathematics—the so-called STEM subjects, which receive so much official encouragement—are having a tough time getting the jobs they’d like. Cappelli reports that only about a fifth of recent graduates with STEM degrees got jobs that made use of that training. “The evidence for recent grads suggests clearly that there is no overall shortage of STEM grads,” he writes.
Why is this happening? The short answer is that nobody knows for sure. One theory is that corporate cost-cutting, having thinned the ranks of workers on the factory floor & in routine office jobs, is now targeting supervisors, managers, & other highly educated people. Another theory is that technological progress, after favoring highly educated workers for a long time, is now turning on them. With rapid advances in processing power, data analysis, voice recognition, & other forms of artificial intelligence, computers can perform tasks that were previously carried out by college graduates, such as analyzing trends, translating foreign-language documents, & filing tax returns. In “The Second Machine Age” (Norton), the M.I.T. professors Erik Brynjolfsson & Andrew McAfee sketch a future where computers will start replacing doctors, lawyers, & many other highly educated professionals. “As digital labor becomes more pervasive, capable, and powerful,” they write, “companies will be increasingly unwilling to pay people wages that they’ll accept, and that will allow them to maintain the standard of living to which they’ve been accustomed.”
Cappelli stresses the change in corporate hiring patterns. In the old days, Fortune 500 companies such as General Motors, Citigroup, & I.B.M. took on large numbers of college graduates & trained them for a lifetime at the company. But corporations now invest less in education & training, &, instead of promoting someone, or finding someone in the company to fill a specialized role, they tend to hire from outside. Grooming the next generation of leadership is much less of a concern. “What employers want from college graduates now is the same thing they want from applicants who have been out of school for years, and that is job skills and the ability to contribute now,” Cappelli writes. “That change is fundamental, and it is the reason that getting a good job out of college is now such a challenge.”
Obtaining a vocational degree or certificate is one strategy that many students employ to make themselves attractive to employers, &, on the face of it, this seems sensible. If you’d like to be a radiology technician, shouldn’t you get a B.A. in radiology? If you want to run a bakery, why not apply to Kansas State & sign up for that major in Bakery Science? But narrowly focussed degrees are risky. “If you graduate in a year when gambling is up and the casinos like your casino management degree, you probably have hit it big,” Cappelli writes. “If they aren’t hiring when you graduate, you may be even worse off getting a first job with that degree anywhere else precisely because it was so tuned to that group of employers.” During the dot-com era, enrollment in computer-science & information-technology programs rose sharply. After the bursting of the stock-market bubble, many of these graduates couldn’t find work. “Employers who say that we need more engineers or IT grads are not promising to hire them when they graduate in four years,” Cappelli notes. “Pushing kids into a field like health care because someone believes there is a need there now will not guarantee that they all get jobs &, if they do, that those jobs will be as good as workers in that field have now.”
So what’s the solution? Some people believe that online learning will provide a viable low-cost alternative to a live-in college education. ... Another approach is to direct more students & resources to two-year community colleges & other educational institutions that cost less than four-year colleges. President Obama recently called for all qualified high-school students to be guaranteed a place in community college, & for tuition fees to be eliminated. Such policies would reverse recent history. In a new book, “Learning by Doing: The Real Connection between Innovation, Wages, and Wealth” (Yale), James Bessen, a technology entrepreneur who also teaches at Boston University School of Law, points out that “the policy trend over the last decade has been to starve community colleges in order to feed four-year colleges, especially private research universities.” Some of the discrepancies are glaring. Richard Vedder, who teaches economics at Ohio University, calculated that in 2010 Princeton, which had an endowment of close to fifteen billion dollars, received state & federal benefits equivalent to roughly 50,000 dollars per student, whereas the nearby College of New Jersey got benefits of just 2,000 dollars per student. There are sound reasons for rewarding excellence & sponsoring institutions that do important scientific research. But is a twenty-five-to-one difference in government support really justified?
Perhaps the strongest argument for caring about higher education is that it can increase social mobility, regardless of whether the human-capital theory or the signalling theory is correct. A recent study by researchers at the Federal Reserve Bank of San Francisco showed that children who are born into households in the poorest fifth of the income distribution are 6 times as likely to reach the top fifth if they graduate from college. Providing access to college for more kids from deprived backgrounds helps nurture talents that might otherwise go to waste, & it’s the right thing to do. (Of course, if college attendance were practically universal, having a degree would send a weaker signal to employers.) But increasing the number of graduates seems unlikely to reverse the over-all decline of high-paying jobs, & it won’t resolve the income-inequality problem, either. As the economist Lawrence Summers & two colleagues showed in a recent simulation, even if we magically summoned up college degrees for a tenth of all the working-age American men who don’t have them—by historical standards, a big boost in college-graduation rates—we’d scarcely change the existing concentration of income at the very top of the earnings distribution, where C.E.O.s & hedge-fund managers live.
Being more realistic about the role that college degrees play would help families & politicians make better choices. It could also help us appreciate the actual merits of a traditional broad-based education, often called a liberal-arts education, rather than trying to reduce everything to an economic cost-benefit analysis. “To be clear, the idea is not that there will be a big financial payoff to a liberal arts degree,” Cappelli writes. “It is that there is no guarantee of a payoff from very practical, work-based degrees either, yet that is all those degrees promise. For liberal arts, the claim is different and seems more accurate, that it will enrich your life and provide lessons that extend beyond any individual job. There are centuries of experience providing support for that notion.”
Labels:
Africa,
Asia,
college,
degree,
economy,
education,
employment,
Europe,
graduate,
jobs,
latin america,
meritocracy,
networking,
North America,
robotics,
school,
student,
technology,
university,
world
Wednesday, October 28, 2015
"21st Century Beggars" by Gatis Sluka
Sunday, October 25, 2015
Paranoia (movie line 1)
Especially true for our "modern" age of smartphones & social media.
Labels:
Africa,
Asia,
Australia,
Europe,
government,
Harrison Ford,
internet,
Liam Hemsworth,
modern,
movie,
North America,
paranoia,
people,
privacy,
public,
smartphone,
social media,
south America,
technology,
world
Thursday, October 15, 2015
Paranoia (movie quote 1)
Loved this line from the movie Paranoia. By the way, Picasso didn't actually originate this expression. But regardless of who did, this movie line exemplifies today's "innovation". Everyone in today's corporate & tech world loves talking about "innovation", but nobody is coming up with anything new or original. Heck, even the late Steve Jobs repeated this Picasso line multiple times, & he is considered one of the greatest innovators of our modern tech age.
So then the question arises: why is our education system so dead set against plagiarism when the corporate world, all around the globe, has its hands & heads deep in plagiarising other people's ideas?
Labels:
education,
Gary Oldman,
humanity,
idea,
innovation,
Liam Hemsworth,
movie,
paranoia,
people,
Picasso,
plagiarism,
quote,
smartphone,
society,
stealing,
Steve Jobs,
student,
tech,
technology,
world
Thursday, August 6, 2015
"Search Engine of the Future" by Jeff Koterba
Wednesday, July 29, 2015
Criminal Minds, S1E10 (quote 1)
This is my interpretation of this great quote by Sir Peter Ustinov.
When someone keeps pursuing their dreams, they keep raising the bar / stakes to achieve ever more of those dreams, to the point that they start to step on other people's dreams & turn them into nightmares.
For instance, Steve Jobs had a dream of combining humans with beautiful & brilliant machines, to make human lives easier & more comfortable. But after achieving his first dream of making the Apple I, he raised the bar & moved on to his next dream of making the Apple II. Then, after achieving that dream, he moved on to his next dream of making the Macintosh. After a few hiccups in his career, he then moved on to those colourful Macs, then iPods, iPhones, & iPads.
Reading / listening to this story of a college-dropout-turned-entrepreneur is very inspiring for people around the world. But we forget how his "super abundance of dreams" turned the dreams of millions around the world into nightmares.
For example, cameras in iPhones (& then in phones running other operating systems) destroyed the photographic film & camera businesses, like Kodak. Kodak employees were dreaming of one day retiring into the sunset & seeing their kids go to college / university & succeed in their lives, but their dreams turned into nightmares when Kodak went bankrupt.
We can extend this example to any other tech entrepreneur's dreams & how his/her dreams created nightmares for millions of others: the dreams of Travis Kalanick (Uber's CEO) & the nightmares of taxi drivers; the dreams of Kevin Systrom & Mike Krieger (founders of Instagram) & the nightmares of employees of photographic film & camera companies; etc.
We can also apply this quote to any of the industries from tobacco to oil & gas to defence & military to financial services, & even to geopolitical affairs. Companies in all these industries, & politicians in the geopolitical arena, are trying to achieve the dreams of owners (a single owner or multiple shareholders), management, employees, & politicians at all levels, at the expense of creating nightmares for millions around the world: climate change, wars, austerity measures, & adverse health conditions (i.e. cancers etc.).
Labels:
climate change,
Criminal Minds,
defence,
dreams,
financial,
fossil fuels,
health,
Jason Gideon,
jobs,
nightmares,
people,
politics,
quote,
Sir Peter Ustinov,
smartphone,
Steve Jobs,
technology,
tobacco,
TV,
wars
Tuesday, July 28, 2015
The 40-hour work week is a thing of the past
This blog post confirms my opinion that there's no such thing as "work-life balance" any more. You are expected to work 24 hours a day & then some. Heck, if there were 48 hours in a day, we humans would be expected to work all 48.
It's ironic how humans always create problems for themselves. Computers were invented by us humans to relieve us of work & give us more "work-life balance," but now you are considered a good, diligent worker only if you work with the same speed & energy as the damn machine.
Now, the next level of machine automation is AI (Artificial Intelligence) & the Internet of Things, where machines can talk to one another &, perhaps, perform & learn things on their own, freeing humans to do other, more strategic things.
But then the question arises: if the machine is learning on its own, & we all know it can learn far more, far faster, where does that leave us humans? A human brain can't compete with a processor in speed & memory, especially when the machine doesn't even need a human to feed it data; it is learning on its own.
Why would a business, which will of course always try to reduce its costs through efficient & effective procedures, hire people to crunch data, do accounting work, or draw engineering drawings? Robots & machines equipped with AI can & will do all that work, & much more, at a much faster speed, at a fraction of the cost of a human, & far more efficiently & effectively.
What will happen on the streets of developed countries then, when millions of people, young & old, who spent ages studying & getting degrees, are unemployed & have no money to put food on their dinner tables? I think anyone can imagine the chaos in the cities.
At the moment, working more than 40 hours is a mere "inconvenience" for people, compared to where our society is headed. Working 50, 60, or 70 hours will seem like nothing when your choices (assuming a worker will have a choice at all) are to either work at the same speed as that machine for a much lower salary (hey, the machine doesn't even need a salary to support anyone) or leave the company.
---------------------------------------------------------------------------------
The phrase “nine to five” is becoming an anachronism.
About half of all managers work more than 40 hours a week, according to a new survey from tax & consulting firm EY, & 39% report that their hours have increased in the past 5 years. Little wonder, then, that one-third of workers say it’s getting more difficult to balance work & life.
The survey, which fielded opinions from 9,699 full-time employees in 8 countries, raises some questions about the sustainability of the current pace of work, said Karyn Twaronite, who heads up diversity & inclusion efforts for EY & commissioned the study.
Employees report that their responsibilities at work have increased while wages have largely stayed flat. And while technologies like company-provided smartphones & remote-work software have bought workers some flexibility, they also keep “people tied to work 7 days a week,” Ms. Twaronite noted.
58% of managers in the US report working more than 40 hours a week, surpassed only by managers in Mexico, where 61% say they’re working those hours. By comparison, just over a third of UK managers & under a fifth of managers in China report working beyond 40 hours.
The reported shift in working hours appears to hit parents particularly hard. Some 41% of managers who have kids say they’ve seen their hours increase in the last 5 years, as compared to 37% of managers who do not have children. Working women & parents also rated the task of managing their work & personal lives as slightly more difficult than men & those without children, but respondents of both genders & all generations reported that they’re feeling the crunch. (That study also had some surprising findings about the Millennial generation as working parents.)
What’s making it so hard to navigate career & family? Participants blame flat salaries & rising expenses, along with the increased workload. Managers in the US say they have a hard time getting enough sleep, finding time for themselves & handling more responsibility.
That finding suggests corporate leaders need to think more about employees’ well-being, Ms. Twaronite said.
“There really isn’t any downtime any longer where people could sign off for the day & be done,” she said. “You can be done for the day but it will be morning in China & you need to be responsive to that.”
Some companies tout flexible scheduling–letting workers leave early or take off Fridays, for example—as one remedy. But some US workers say flex arrangements are an imperfect solution. Some 9% said that they have “suffered a negative consequence as a result of having a flexible work schedule,” such as being passed over for a promotion or losing a job.
Labels:
AI,
Artificial Intelligence,
automation,
business,
computer,
economy,
education,
future,
jobs,
labour,
machines,
modern world,
people,
public,
robot,
robotics,
smartphones,
technology,
work,
work life balance