Really solid reasoning and we should listen to Burry.
WFH will reduce middle management and "prestige" employees as more efficient workflows evolve from the old "filing cabinets and typewriters" methods that some companies are still using to this day.
Automation is coming, and it is definitely coming for the white collar workers who act as human CRUD apps.
The ability to collate, search, and analyze data used to require labor power; it doesn't anymore.
I watch my partner's experience in accounting and the differences are stark.
There are many, many BS steps in paper-based accounting that disappear when digital tools are used.
Having physical pieces of paper as a fungible business tool is still common at a lot of firms, but those firms are simply behind the times; relatively simple software solutions exist for most paperwork needs, and they reduce the labor power required.
"We've always done it this way" is an excuse for firms that will fall behind other digitally empowered firms.
What does this mean for investors? Don't invest in companies that insist on using filing cabinets.
> Automation is coming, and it is definitely coming for the white collar workers who act as human CRUD apps.
I know some people who do this kind of work in various organizations.
As far as I (and they) can tell, it all should have been automated decades ago, down to 1/10 or less of the staff it takes now, but their orgs keep getting basically scammed by vendors promising the moon and delivering a mimeograph of a smeared Xerox of a bad photo of the moon, so it never happens. There's constant tool churn, but nothing ever gets faster or better. Anyone internal smart enough to automate any of it themselves either quietly automates some of it for themselves and their buddies so they can slack more, or does none at all and tells no one what they're capable of, because it would just mean more work and management would probably fuck it up anyway, if not be angry about it.
It's a leadership failure and it seems to be more common than not. Maybe we'll reach a tipping point where such orgs simply die from that kind of thing, but it really seems like we ought to have by now, considering how much "low hanging fruit" that could have been tackled in the '80s is still around.
Hilarious example: our org decided to "optimise" travel expenses by mandating all our bookings through a centralised travel service. The service offers a corporate portal that allows orgs to specify their policy and ensure compliance while letting employees "self-serve" to book travel.
Unfortunately the software is so bad and so complex that upon deployment, they realised they couldn't just roll this out for direct access to staff. It is full of travel-industry terminology nobody understands combined with corporate org policy terminology few understand.
So they designated specific staff as "travel managers" who would be the ones to book travel for their group. These people then get special training etc. In practice, however, none of the managers have time for this, so we are all delegating it to admin staff who already do other admin-type work for our teams.
And so the whole exercise has brought us full circle to where dedicated staff are effectively manually booking travel for us. And of course then COVID hit so nobody traveled for 2 years after that and we all prefer remote / zoom as much as possible anyway.
Ahh, I see you too use Egencia... fucking useless piece of shit.
10,000-person tier-1 tech company, 80% of the staff are software engineers, so let's be conservative and say the average cost of an hour of each employee's time is $100.
I've got a great idea: let's implement a tool that means it takes a minimum of 4 hours to complete the process of booking flights, so that we can better enforce budget policies and make sure people don't spend an extra $50 on flights.
Let’s also ignore the fact that the “cheaper” flight the tool makes you book is several hours different timing to the one you wanted, meaning you’re now losing nearly 8 fucking hours of productivity…
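Back of the envelope, the cost asymmetry being mocked here is easy to make concrete (all figures are the hypothetical ones from this comment):

```python
# Hypothetical figures from the comment above: a $100/hour employee
# forced through a 4-hour booking flow to save $50 on a ticket.
hourly_cost = 100     # average loaded cost of an hour of employee time
booking_hours = 4     # time lost fighting the booking tool
flight_savings = 50   # budget "saved" by the enforced cheaper fare

booking_cost = hourly_cost * booking_hours
net = flight_savings - booking_cost
print(f"Net saving per trip: ${net}")  # -> Net saving per trip: $-350

# If the cheaper fare's bad timing also burns ~8 hours of productivity:
net_with_bad_timing = flight_savings - (booking_hours + 8) * hourly_cost
print(f"With bad timing: ${net_with_bad_timing}")  # -> With bad timing: $-1150
```

Even before the scheduling hit, the tool costs eight times what it "saves" per trip.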
It was announced that we were switching to Egencia, but then it was cancelled after feedback from user testing. Since all the other business apps we use are frustratingly shitty and broken in obvious ways yet nobody bats an eye, we all inferred that Egencia must be a truly special kind of dumpster fire.
> Let’s also ignore the fact that the “cheaper” flight the tool makes you book is several hours different timing to the one you wanted, meaning you’re now losing nearly 8 fucking hours of productivity…
Yeah… but you saved $50 on the flight…
How does a cheaper flight lose productivity? I have never flown or been on a plane before.
8 hours is a work day. He means that you have to take the flight one day earlier (e.g. because the flight arrives three hours too late) which means that you lost an entire work day on poor scheduling.
Most likely it has several stops; direct flights can be expensive, while ones with layovers can be cheaper. But those are longer, carry a risk of cancellation, and are higher risk overall.
Then there are also red-eyes: it's hard to sleep on a plane, so you may get somewhere in the morning, but if you didn't have a full night's rest, what's the point? You're exhausted.
That's very interesting. Would it be okay for me to ask how it is possible that you have never been on a plane? Is it due to you being very young, or afraid of flying, or living in a part of the world where flying is uncommon?
I recently used concur, and by 'used' I mean I clicked to a page that said "our servers are down" and then I called the travel agent directly who then booked my travel over the phone.
I think Concur depends on the implementation: at one job I had to fax in receipts, which was painful; at another they had an email gateway configured where you just took a photo of the receipt and emailed it, and it did OCR and imported all the receipts automatically.
I've worked on an automation tool that has a dedicated team to work with it and all the other self service tools.
It's really not a worthwhile effort getting everyone to learn the domain specifics, and a few domain experts to translate works pretty well, despite looking silly.
The idea "this could all be easily automated so those who don't automate it will go out of business" presumes efficient market theory is true. Markets are not actually efficient. Markets are based on relationships between human beings. And a lot of human beings have a vested interest in things staying exactly the way they are.
that is why i automate some things but in general "keep my head down and my mouth shut". the people who get all angsty and revolutionary about this stuff tend to get fired / laid off / asked to leave / burned out. i know because i used to be one of them.
the people who survive are quiet and don't complain about "leadership failure" as you put it. we do not get paid to point out failures of our leadership. or to do things which would by their very existence imply a failure of leadership. we get paid to do what leadership tells us. our continued supply of health care, shelter, and food depend on it.
if i want to develop my talent / skill, well that is in my free time. i can build robots in my basement and play around with new algorithms and nobody will get mad at me or tell me to stop. now you'll have to excuse me, there is an STM32 board calling my name.
Haha you know I used to move from company to company looking for the one that would make my 60 hours a week pay off. Ultimately I cared about things that these companies did not.
My employers and I were much happier when I switched to 30 hour work weeks and went to grad school part time :P
YES so much this. It's really not hard to see why so many people are doing resume driven development, everyone is overworked and understimulated. Nothing matters, the business doesn't care about good engineering, there are so few interesting problems, ugh.
It's often cheaper for big companies to operate a human meat grinder that churns and burns cheap labor than to employ engineers to automate and monitor manual processes.
I've never been convinced that efficiency and automation would result in fewer jobs.
Imagine an economy where there were no ATMs. Any time you wanted cash, you would have to walk to the bank and wait in line to talk to a bank teller and cash a check. And every week you'd be paid by a check. Paper records were collected for everything and an army of employees had to reconcile the records. No spreadsheet software was available.
In fact, the number of bank tellers (not even all bank employees) has actually doubled since the advent of ATMs:
> The number of human bank tellers in the United States increased from approximately 300,000 in 1970 to approximately 600,000 in 2010. Counter-intuitively, a contributing factor may be the introduction of automated teller machines. ATMs let a branch operate with fewer tellers, making it cheaper for banks to open more branches. This likely resulted in more tellers being hired to handle non-automated tasks, but further automation and online banking may reverse this increase.
But somehow today more people work at banks than in the past, even accounting for population growth. Why would CRUD apps that make some stuff more efficient change this?
All these people theorizing the end of work have never actually worked in a large company, and don’t understand the motivations of people making hiring decisions.
The larger a company gets, the more careerists it attracts. For careerists, “number of people managed” is a key performance indicator (and a boardroom/resume bragging right).
The human motivations of decision makers - to advance their own careers by being the manager that manages more people - is completely misaligned with these future efficiency hypotheses
As long as that remains true, we will continue to see bloat and companies hiring way more people than they need.
To go from Sr. Manager to Director requires experience managing managers who report to you. How do you get more middle managers? You hire more people.
How do you go from Manager to Sr. Manager? You manage a large team and create a need for your org to have multiple managers.
So yes, lots of perverse incentives to grow your org so as to grow your career.
I've seen that in places I've worked. I've always resisted growing my team's headcount, as I've seen it break efficiency first hand. However, I'm well aware it makes our team less noticed, and less attractive when other companies ask for my current reporting count as a metric of how senior I am.
You must be me. I have a very strong technical team, and our work is foundational for the product. But it is harder to convey that impact in interviews, compared to saying you have a larger team and manage a large budget. I feel companies don’t value efficiency.
Right, so more efficiencies could free up resources to hire even more middle managers or administrative staff. It's only natural that well funded startups will have roles for "experience coordinator" or something. You think FAANG would have such a large headcount if they didn't have crazy margins?
There are some other tides working in the favor of bloated salaries and payrolls. One is the focus of "stakeholder capitalism". Most of the large companies are held by ETFs. The large banks that run the ETFs are less concerned about individual efficiencies of any one company (they own them all anyway), but are more interested in other things (green initiatives, social causes, etc). So the decisions being made at these companies are less about profit and more about servicing "stakeholders", be it their employees, the community, the environment. That's all fine, but it's difficult to track and hold executives responsible. A bloated staff is fine because you don't want to lay off employees. Or various money losing departments are okay because they're servicing the community.
If there's anything folks like more than reports, it's money. Like the savings and goosed stock price from offshoring, especially "offshoring to local robots," which are even more reliable.
Counterpoint: if the company actually had good upper management, they will only let the mid level managers grow their team if it brings value to the company.
No org is perfect, but I keep seeing this take again and again and again, yet rarely any form of acknowledgment that it could be good for the company to incentivize proactively growing a team
> But somehow today more people work at banks than in the past, even accounting for population growth.
Perhaps that's not quite the right comparison: even though the population hasn't doubled since 1970, the number of people employed has (80M -> 160M, approx [0]). Adjusting for that, bank teller employment is basically unchanged.
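Using the round numbers quoted in this thread, the adjustment works out to an essentially flat tellers-per-worker rate:

```python
# Round figures quoted above: ~300k tellers in 1970, ~600k in 2010;
# US employment roughly 80M -> 160M over the same span.
tellers_1970, tellers_2010 = 300_000, 600_000
workforce_1970, workforce_2010 = 80_000_000, 160_000_000

rate_1970 = tellers_1970 / workforce_1970  # tellers per employed person
rate_2010 = tellers_2010 / workforce_2010

print(f"{rate_1970:.5f} vs {rate_2010:.5f}")  # -> 0.00375 vs 0.00375
```

So per employed person, teller headcount didn't move at all; the "doubling" tracks workforce growth.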
The invention of things like saddles and carriages only made horses more and more valuable over the years, until automobiles became a more practical replacement.
I think it is hubris to say that anything a human can do cannot eventually be replaced by a machine. When we will actually reach that point is the real question.
Banks in the US are not a good example because they are several decades behind the rest of the world. I haven't stepped purposely into a bank except to open an account, in about twenty years.
Fednow will be great for most domestic transfers but I still expect any international ones to require either a third party service (Wise, Xoom, etc) or a call/visit to the bank.
It does (at least when I last looked at FedNow ISO 20022), but that still only handles things until the first routing institution outside the States, which means that personally (from my experience in the field) I don't expect wire transfers and domestic sends to exist on entirely the same rails and payment forms.
Automation does not remove jobs. It simplifies them, and makes them lower skill and thus lower wage jobs. A bank teller today is a terrible job making terrible wages. It has basically no career trajectory. There may be more of them, but they’re not the same job at all.
The number of tellers is expected to drop by 12% over the next decade while general jobs are expected to grow by 5% according to government forecasts btw.
Being a cashier used to be a tougher job because you needed to know all the inventory. Now it’s ok for a grocery store cashier to not know what parsley is.
If automation makes it practical to do an embarrassingly small task in order to provide a value to someone richer, yeah, you can be hired into it. Maybe a lot of people will. But that’s not prosperity that’s capitalism holding you by the balls.
This reads to me that only being capable of doing an embarrassingly small task is the problem. If that’s your ability, you’d prefer a world where that’s possible to one where the minimum floor for ability is higher than your own.
Yes, on average, people would prefer to be more capable rather than less.
Not everyone is equally capable, which implies that some people are on the left side of the distribution. Some jobs that simplify the essential actions bring the job to meet the capabilities of the worker.
It’s a misleading argument. People can be trained on the job for many things and that’s how it’s historically worked.
Lowering the bar is not needed or helpful. People who are so incapable that these jobs are their peak potential should probably just be on social programs.
Why should X% of people be on social programs if 75% of them can type a number from the parsley tag into the computer (or scan the tag on the parsley)? Isn't it a more dignified life to have a purpose of helping your fellow humans select and pay for their groceries than to sit around being paid by a social program because we didn't want to make cashiers' jobs more automated? I think for those people lowering the bar is helpful to their psyche.
(I had a family member, since passed, who was significantly mentally handicapped by a difficult birth sequence. She loved nothing more than to feel like she was included and being helpful to the limits of her ability.)
> But somehow today more people work at banks than in the past, even accounting for population growth. Why would CRUD apps that make some stuff more efficient change this?
Population growth doesn't account for regulatory changes and it takes a while to automate anything that interfaces with the law because of the inherent risk in getting it wrong. Banks are acutely exposed to this because they're often the ones who are ultimately responsible - they're the default insurance policy against counterparty risk.
Sure, that makes sense when talking about employee count in aggregate. But specifically the number of bank tellers has doubled. Surely bank tellers should have more tools at their disposal to be more efficient, and surely fewer people would use bank tellers with things like ATMs and online banking. But somehow we have more bank tellers.
Tellers could have gotten twice as efficient and still double in size if the number of transactions requiring human intervention increased four-fold, especially if the move to online banking has driven an increase in fraud and user error. It's a number that scales with regulatory demands, transaction count, total volume, and so on - not number of active users.
Edit: Tellers also tend to be general purpose account managers at smaller banks, so if you called up the bank you'd eventually get forwarded to someone on the floor at your local branch instead of a call center.
Maybe the percentage of people who are unbanked has decreased significantly since ATMs were introduced.
Additionally, people are much more reliant on financial services and payment rails. Credit cards weren't as common (and required for almost everything) 20 years ago. People used cash for probably 10X as many transactions as they do now, so you probably had more 'cash economies'
Branches exist for the most part as places for people to open accounts. Everything else is understaffed and exists as a side effect to get more accounts opened at the bank.
Probably because they've ascertained that one of the key things customers look for in selecting a bank is whether there is a branch near them. Never mind if there is literally anything in there other than an ATM ...
> I've never been convinced that efficiency and automation would result in fewer jobs
I’m not sure I buy into the bank analogy (probably due more to technology making bank products cheaper/more accessible and thus more customers and thus more bankers) but I definitely buy this.
If you believe fundamentally in basic market dynamics and that humans are generally productive creatures, then any automation that comes about will simply shift where the human labor is most useful, and we will generally always have enough human powered work to go around. The tech and automation is just making it more efficient for everyone.
For everything other than $100, $20, and $5 bills I use a teller, which translates to using a teller every week or two. Why? Because in many parts of this world cash is still king. Get out of the city for a change. Bring cash.
How do you do a wire transfer from an atm? Even with a private banker I still have to call someone to move money above X amount - I can’t do everything through a web portal. Maybe my bank just sucks, though.
For how many thousands of years did innovation only increase the employment of horses? Until their employment suddenly dropped, and hasn't increased in a century.
I don't really find this convincing. Companies have been transitioning from paper for 20 years, and most have superseded it. WFH is hardly a step change.
Psychologically, for some it is a massive change; I've had numerous colleagues in the past with core values like "if you are not physically sat at your desk, you are not working".
On every metric, excluding commuter congestion and possibly office rental rates - no change.
I used to work in process automation. So you're aware, the reason human CRUD apps exist is to handle edge cases & changing context. Often, it takes more time/money to adequately describe the human process than it does to just do it.
You can see this principle with ERP implementations - they can literally bankrupt an entire company if you get too ambitious with what can be automated.
Edge cases & changing business context aren't going away.
Bingo! The business I work for specializes in and caters to edge cases. It is why we are profitable and we let the big boys do all of the big easy jobs that aren't as profitable.
In my experience, it's the exact opposite. WFH creates more of a fog of war, so companies end up hiring more managers to stay on top of it.
Some employees are great at self-directed WFH, but many others struggle without the physical, social context of the office to get them focused. Again, companies rely on more managers to compensate for those employees who struggle more with WFH.
I know people will say "Just hire people who are good at WFH", but if we transition large amounts of the workforce to WFH then you can no longer pick-and-choose just the good WFH people. You have to deal with the realities of managing WFH at scale.
What about Junior ICs who need a lot of mentorship and coaching? They are a lot more in number (twice that of Senior ICs in my observation) and almost everyone uniformly asks for more in-person events and more mentors.
In fairness Junior ICs also miss the collegiate atmosphere of going out with all their work friends after the day wraps ;) I was a Junior IC with smart attractive colleagues once…
It is a real issue though and the answer isn’t obvious. I tend to think that something like A16Z’s model will wind up as the end state: modest amounts of flex office space, geographically distributed, with frequent, family-friendly off sites.
Pair-programming software infrastructure will have to improve, and maintaining a hub of urban office space where Senior ICs get a COL bump to be near the young folks will probably be part of it.
It’s a new world, and there will be an adjustment, but there’s enough critical people who simply aren’t going back that it’ll be management’s job to figure it out. I imagine figuring it out well will pay commensurately well.
A simple first-order approximation of your first 3 paragraphs - hybrid model. Companies are discovering that offsites are in fact costlier for the value they provide. Hybrid would require an office and hence a higher cost, but it also has a much bigger bang for the buck.
> but there’s enough critical people who simply aren’t going back that it’ll be management’s job to figure it out
Pendulum is swinging, my friend. With a big recession looming, companies will have all the leverage. Elon has already paved the way and other big companies will soon follow. On the other end of the spectrum, a lot of startups will be bootstrapped in this recession and a majority of startup founders want to have the initial bootstrapping team to be together[0].
[0] Based on my first hand observation working as an advisor for a VC firm.
Your point is taken about the macroeconomic situation (I don't have quite enough conviction to go short, but it's a near thing; Q3 earnings are looking to be a shit show).
With that said, we've been through this before: companies always have leverage with many/most of their employees, and that is going even further up for awhile, but Google Search doesn't stay up without a meaningful number of people who have made enough money to not be pushed around.
This is the flip side of being one of the "CEO/VC class" and having a boom decade or two: most of the real pros made some serious money on stock and can't be herded like cattle the way new grads can be.
The American economy has become an experiment in how close you can get to indentured servitude for your working people without calling it that by twisting the macroeconomic knobs until the kid doesn't get medicine unless you hump that bullshit job. I'm sure the money guy who figures out how to do that to hardass infrastructure hackers or founder-caliber kids will be hailed as a hero (in private), but it hasn't happened yet alhamdulillah.
Oh I agree, but if attendance is a performance metric then the work going on doesn't need an expensive white collar management infrastructure, which I think is part of Dr Burry's point (even though the article is referencing cooked studies about WFH productivity loss and couching the reference in weasel language).
You seem fairly resolute that knowing if someone has their butt on their seat is useful, but I would invite you to try finding some other metrics that actually measure output, and try rewarding people for performing highly on those metrics. I can tell you, I've seen countless unproductive hours wasted with butts firmly on seats. It's not a good measure of productivity.
Jim is a hard worker, always first in the door and last out at night. - if only you were as committed as Jim, instead of being 2x plan, you would be 4x plan.
Jim sits and watches cat videos on youtube all day - is 0x plan, and will be promoted to being your boss next year.
I work for an organization where all the accounting is digital, has been for decades. Yet every miscellaneous expense reimbursement requires a receipt. That physical receipt has to be submitted with an expense report. The physical receipt is then scanned when the expense report is processed, but it is also filed and has to be kept for 7 years, subject to audit.
It's the most ridiculous thing I've ever seen, so it must either be a legal requirement or maybe the organization had a really bad experience with expense fraud and is now paranoid.
I've wasted more time (and thus salary cost) using Concur than any amount of money the company is 'saving' by making me USE Concur. An asinine, idiotic, cumbersome, SLOW, inefficient, and often _wrong_ process to book any travel with.
Not sure about US law, but in my country it's fairly similar. The physical copy of the receipt must be kept for 7 years, but "only" for 3 years if you make a digital copy of it.
It's quite a hassle if you WFH and expense things, since now you need to bring the physical receipts to the office.
> The ability to collate, search, and analyze data used to require labor power, it does not anymore.
This is all baloney. There is no one organising the information and documents in "modern" companies. There is just a rolling reset of knowledge with an average lifetime of about three years.
> Having physical pieces of paper as a fungible business tool is still something used by a lot of firms but they are simply behind the times,
They are not behind their times. They are using a system that actually works.
I wish we could just return to the basics for knowledge management. Have bibliographic databases and enforce proper referencing. Have a numbering system to locate things and a standardised classification system. And employ actual people to manage it.
Let’s take "a really big bank" (you know who). I know tons of people there, with very high-paying jobs, who do absolutely nothing that cannot be replaced by a few lines of Python, and this has been the case for many years. You don't need WFH to automate all these people, and their managers, away; it has been true for at least a couple of decades, and yet these people are in demand and get hired for serious amounts of money all the time. Same at other (big) banks. You think this is going to end soon? I see 0 moves towards that; at one team I know, they just hired yet another manager (who manages people who fill spreadsheets from emails and mail them out later) because the current managers felt they were overworked, so now they have 1 manager per 3 people. They could have cut out the entire department since at least 2012, probably since 2002.
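For what it's worth, the "few lines of Python" is plausible for the workflow described: filling a spreadsheet from emails. A minimal sketch, where the message format, the field names (`Account`, `Amount`), and the output file are all invented for the example, not a description of any real bank's setup:

```python
import csv
import email

# Hypothetical example message; in practice you'd read .eml files from
# a dropped folder or an IMAP mailbox instead of a string literal.
raw = """From: trader@example.com
Subject: Daily position

Account: ACC-123
Amount: 4500.00
"""

def extract_row(raw_message: str) -> dict:
    """Pull 'Key: value' pairs out of the message body."""
    msg = email.message_from_string(raw_message)
    row = {}
    for line in msg.get_payload().splitlines():
        if ": " in line:
            key, value = line.split(": ", 1)
            row[key] = value
    return row

row = extract_row(raw)
with open("positions.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Account", "Amount"])
    writer.writeheader()
    writer.writerow(row)
```

Wire that to a mailbox poll and a real spreadsheet library and you've replicated the department; the hard part, as other comments note, is organizational, not technical.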
This has been said over and over. In the 1980s people worried automation would leave lots of people unemployed. And long before that, in 1930, Keynes [1] predicted we should by now be living in a world of abundance. He said our biggest concern would be how to spend all this free time.
Yet I see people working harder than ever, and burnout is the number one reason for absence from work in many countries.
Every step you automate opens up new possibilities, and more employees are needed. I have worked as a software engineer for around 15 years now. Never has my software reduced costs, because as soon as my software automates something, people want more.
Although I'm skeptical, we could really be facing a turning point; Keynes made his predictions for 2030, so we'll see :)
[1] Economic Possibilities for Our Grandchildren (1930)
> Yet I see people working harder than ever, and burnout is the number one reason for absence from work in many countries.
I do have a pet-theory that humans aren't well equipped for long-term 9-5 computer thought work. Anecdotally, I see people burning out in < 5 years with thought work careers. People under 30 packing up their careers to go live in a van or whatever it may be.
I really disagree. The past 20 years have shown us that automation is the main driver of white collar job growth.
There are very few industries that haven't already digitized, the ones you mentioned that have human CRUD apps are relatively rare. Accounting, law, healthcare...who else? Basically nobody.
How many new companies exist now because of automation?
Sure, there are some prestige managers out there, but I think their prevalence is overstated. Most managers I know in tech companies are overworked and have no time on their schedule.
"WFH will reduce middle management..." I find that an interesting idea, because at my company the introduction of work from home has increased middle management to herd remote cats. Because you can't just talk anymore, you have to document every last effing thing, and then have interminably long meetings to go over it. I'm skeptical. WFH seems to be a middle manager's dream.
Yeah, there is a sense that more PDF-certified scrum masters are creeping in everywhere, but it'll soon be visible that it's all bollocks. Any senior dev should be able to manage their own time without a nag constantly asking about progress, and companies will see little value over time in having these glorified secretaries pollute work processes.
Increasing automation... the increasingly widespread adoption of enterprise software, data analytics, and machine learning is rendering office jobs moot, as much of the traditional number crunching, document processing, and repetitive analysis performed by office professionals is now performed by software.
This is not even wrong, in the W. Pauli sense. Just one of the many reasons: enterprise software works less and less for the enterprise that "rents" it and more and more for the enterprise that controls it.
Everyone who is on the receiving end of enterprise software hates it.
As for the accounting example DM5 uses, ask his partner about when three people have to coordinate fixing a single PPV in a single PO. It can take days to fix a single mistake. Meanwhile, dozens or hundreds more POs have piled up.
PO = purchase order, PPV = purchase price variance; 3 people = separation of duties, because people try to steal.
But then they counterbalance this with the introduction of efficiency-sapping regulations and compliance, which adds little benefit but lots of overhead/headcount. Look at university administration: instead of automation REDUCING administrative headcount, it has gone UP!!
That just sounds like the company is lagging badly, frankly; almost all the companies I worked with phased out paper around a decade back, with stragglers by ~2015.
At this stage I'd welcome any automation in the accounting space, but the efforts I've seen thus far have not been promising. Domain-specific software is tricky: either you get SWEs in and they don't know the domain, or the opposite and the code is garbage. That gap means endless meetings and committees about requirements and end-user testing in practice, even for the most trivial of automations.
Outsourcing is way more likely to kill accounting jobs.
There is a lot of room for further development in accounting that is not presently happening because so many people are required to file all the lunch receipts. A shortsighted business might fire its staff when it finds out how to buy an OCR system, but it would be wiser to explore what you can do with people who can collect and collate complex business data with great reliability.
>Having physical pieces of paper as a fungible business tool is still something used by a lot of firms but they are simply behind the times, there are relatively simple software solutions for most paperwork needs and this reduces labor power required.
Until some genius writes `delete from invoices ...`
I don't think that we are nearly close enough to have digital recordkeeping done right.
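To make the risk concrete: one common mitigation is soft deletes, where nothing is ever really removed, only flagged, so a fat-fingered statement can't silently destroy the books. A minimal sketch using Python's sqlite3 and a hypothetical `invoices` table (purely illustrative, not anyone's actual schema):

```python
import sqlite3

# Hypothetical schema: a "deleted" flag instead of real deletion,
# so the audit trail survives mistakes.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL, "
    "deleted INTEGER DEFAULT 0)"
)
con.executemany(
    "INSERT INTO invoices (amount) VALUES (?)",
    [(100.0,), (250.0,), (75.5,)],
)

# "Deleting" an invoice just flips the flag.
con.execute("UPDATE invoices SET deleted = 1 WHERE id = ?", (2,))

# Normal queries read through a filter; the row is still recoverable.
live = con.execute(
    "SELECT COUNT(*) FROM invoices WHERE deleted = 0"
).fetchone()[0]
total = con.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
print(live, total)  # 2 3
```

It doesn't solve everything (a dropped table is still a dropped table), but combined with backups it removes the single-statement failure mode above.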
Ha ha, nothing will go away as long as there are “agents” and “licenses” and lawyers for all activities. They could all be disrupted, the technology is definitely there, but the elephant will not move an inch to make way for robust technology to make it smaller.
This is almost certainly the line of reasoning that led to COBOL, and yet here we are with guilds and corporations that still need management to relate the present and future of their trade's procedures and environment to the real world.
Those paper trails are established by laws. Entire job sectors exist solely because of regulation. What is the probability that politicians and lawyers will agree to remove that legislation?
Burry is a hedge fund manager. The most important part of his job is to scare people away from the S&P and entice them to put money into his fund.
It's generally a mistake to trust the financial advice of any hedge fund manager, and they usually provide advice that makes the S&P seem like a bad bet.
That's a vapid take. If I have information that isn't widely known, and I have established a vested position, then it is always profitable for me to promulgate my information. That doesn't suggest my information is wrong.
What if I don't have information that isn't widely known, but people listen to me because I was right once in 2008, and I use that attention to influence people to do what I want based on what I say?
Couldn't that be the other side of the coin?
it's not information being promulgated, but speculation.
Time will tell whether said speculation will turn out to be right. However, if more people believed such speculation, it might make it more likely to be right in the future. Therefore, there's a selfish reason for said person to promulgate such speculation.
That depends on your alternative options. The S&P 500 has averaged 6.6% real returns historically. Better than bonds? sure. Better than unleveraged global real estate? sure. Better than gold? sure. Better than PE? no. Better than leveraged rentals? hell no. Better than investing in productive capacity in your own startup? lol.
No one ever became wealthy from buy and hold index funds. The returns are relatively low because most people are lazy and it's the easy thing to do. That's why I split my time between active investing (https://grizzlybulls.com) and working on my private business ventures.
> No one ever became wealthy from buy and hold index funds.
Nobody ever became poor from it either (unless they panic). At least your money keeps pace with inflation to some extent. It certainly beats money sitting in your bank account doing nothing (the alternative for most people).
There were long periods where it barely beat inflation, and some where it didn't even do that.
e.g. it took 30 years to recover to 1929 levels, then another 30 years to reach the 1960s peak, and in 2010 it fell to mid-90s levels (all inflation-adjusted).
So yes, you're right, but it depends on how long you can wait. Over 50-60 years you should be fine. 20-30, who knows; if you've entered just before a peak it might not recover within your lifetime.
I don't think the expression applies to harder physical limits such as gravity, or exists as a statement of epistemological nihilism in most cases. I don't doubt that inductive reasoning is useful, however. If I had to hazard a guess based on the phrase, it's that inductive reasoning is difficult to impossible to use effectively or consistently profitably to predict market swings.
Michael Burry has been claiming the sky is falling for more than a year now. In my eyes he is no longer credible. Broken clock and all that.
At this point, the entire globe has fallen into uncharted economic territory. Anyone projecting certainty about what the future holds is a fool, a liar, or both.
The bubble caused by a decade of easy money started to burst last year.
Go look at a stock chart from the last 1Y and you'll see that the sky has started to fall, has been falling for just over a year now, and has a long, long way to go.
Until inflation is under control and fed tightening stops, housing, equities, and bonds will continue to fall in price. This week almost saw a catastrophic run on the pound and UK gilts and US bonds are flashing warning signs.
The crash started last year and continues with mathematical certainty until the fed pivots.
The stock market's price level is well above where it was pre-pandemic, it's trading at earnings multiples in-line with 30+ year averages [1] (what was the 10Y yield in 1990, might I ask?) because earnings have gone up.
I know being doom and gloom makes you sound smart (to some), but don't fall into the trap of 'everything is shit now because inflation has been high for 20 months'
Do you really believe these sky high earnings are going to continue?
And what about all the companies that still have billion dollar market caps but have NEGATIVE earnings? (Atlassian, Snowflake, Uber, Crowdstrike, Shopify.... the list goes on)
Folks, we are in a tightening cycle in order to fight inflation. It feels important to remind everyone that economic conditions like this have not happened in decades. This site (Hacker News) has pretty much entirely existed in the era of cheap money and low interest rates.
Keep your head in the sand at your own peril.
Companies are tightening their belts and you should too.
>And what about all the companies that still have billion dollar market caps but have NEGATIVE earnings? (Atlassian, Snowflake, Uber, Crowdstrike, Shopify.... the list goes on)
I think they are an extremely small part of the $100tn global stock market that people here love talking about because they work there.
>Folks we are in a tightening cycle in order to fight inflation.
My point was that we are at multiples in-line with 1990, ya know when we had 8% 10Y yields.
Folks, call the end of the world at your own peril.
> My point was that we are at multiples in-line with 1990, ya know when we had 8% 10Y yields.
Is any part of you worried the S&P is headed 3600 -> 3500 -> 3200 -> 3000 -> 2800 given the current climate? I'm worried that the current P/E ratio (from EPS TTM, past 4 quarters) might be a bit "rosy", and the next 4 quarters of earnings are over-optimistically priced (not enough pain has been factored in/the effects of corporations having to try to grow for 18 months in a 4% interest rate environment hasn't been priced in yet).
Additionally, if the entire world stock market is $100tn, the US is sitting pretty, because at $49tn the US stock markets represent HALF of the entire world stock market.
Hyperinflation is more likely. Get a loan while money is cheap. Housing will come back no matter what anyway. But honestly, it could be 3x its cost today in 10 years. No one is selling anyway.
Wages are moving down in real terms. Many home purchases are made with 40% of gross income presently. Increased (3x) home costs would mean buyers would be paying out an even higher percentage of gross income. Eventually too many buyers are unable to buy housing and the party is over. I think we are already there, so 3x housing prices (and 3x rental prices) in 10 years seems very unlikely.
The brrr of the printer will only make the spread between wages and costs even worse. Only fixed-rate debt will be eroded by inflation; variable-rate and short-term debt will step up to higher rates.
Yes, that's why it has far more to go to revert to the mean of earnings and price. Earnings are also artificially inflated, and will drift down, and stock prices with them, we're just seeing the beginning of that process.
Here's someone else who called it over a year ago, with some charts that counterpoint those you link:
Just to be clear, the argument is not 'everything is shit now because inflation', it is that the era of easy money has ended, significant inflation and war means a regime change in central bank intervention, and central banks will now tighten till things break. They haven't even managed to unwind QE yet and stop buying their own debt, will they ever? If they do, watch out.
This isn't doom and gloom, it's simple realism about the regime change in interest rates - the era of 0% money is gone, in its place we have a normalisation, which means a normalisation of frothy asset prices and a reversion to the mean of earnings and prices in many domains. Usually these things take a year or two to work out, it won't be a fast process.
I agree with most of what you wrote, but I think it misses that, in an era of 0% money, growth matters more than current profits and so firms rationally trade off the latter in pursuit of the former.
When that changes, I think you’ll see earnings rise as chasing growth now carries a higher cost.
It's a common trope for people who don't like the fact that the stock market is trading at normal-ish P/E levels to say 'well the E is fake!!! Just wait until the E normalizes!'
Usually, they cite things like artificially low input costs (hard to argue that today), unsustainable demand from customers due to 'cheap money' (haven't seen evidence of that yet, the opposite actually), and a general disdain for advertising and retail business models (despite their long-term durability).
Yes, you are the clever one, everyone else just repeats tropes. Personal debt is at record levels, inflating earnings. That has to unwind at some point and it will be when the cost of money increases, which it will, if the Fed has to trigger a recession to do so. We're in a different regime now.
Not to alarm you, but the S&P 500 is only above the pre-pandemic high by about 7.5%, while the Dow is below it. Inflation-adjusted, the S&P 500 is below pre-pandemic highs.
1) 'Inflation adjust' asset prices (if you think the average stock-holder has anywhere near the spend patterns of the CPI basket, boy howdy do I have something to tell you about home ownership and income inequality: rent is 33% of the CPI bucket and that's an 'imputed cost' for the vast majority of stock-holders who own their homes)
2) Think 7.5% in 2 years is a 'bad return'
EDIT: Let's actually play a fun game, what's the real return of the SPY from the perspective of someone planning on taking a trip to Europe/The UK/Japan?
Earnings and P/E in isolation are not good indicators. They should be at least discounted against the “risk-free” rate of return, which if you haven’t noticed is high relative to the past two decades with further increases to come.
I would imagine most HN posters have never done a DCF analysis in their lives, but I know plenty of people who have, and none of them have ever beaten the market. Not sure I buy any model other than Graham-Dodd's these days.
Sure, that’s probably a better model. I don’t see anything wrong with it and it has more parameters.
I just think P/E by itself is way too oversimplified and a pretty garbage metric when used in aggregate and historically, due to things like interest rates, sector/business-model skew, and the relative maturity of companies also fluctuating over time. It's like Week 1 of Value Investing 101, not an actual metric by which you'd want to engage in value investing or historical analysis.
dot-com occurred in an environment where bond yields were considerably higher than they are now. Yet SP500 P/E wasn’t that much lower and had risen to higher levels than in 2021 despite yields being 3x higher.
Basically, fu Main Street, got mine. I wonder if they believe the meta-awareness the internet has provided will just go away if they crash tech/social media? That’s where all the progressive undesirables work, after all.
Past pols convinced people “trickle down” was sincere economics, not a bawdy joke. That Reaganomic funneling of wealth to the top was for their own good, and the public now blames modern progressives. A pol's dedication to doublespeak is commendable.
Supply side economics has taken more people out of poverty than any kind of ISI or other internally focused development programmes. (I’ll also note I’m a far left person who believes there are many flaws with the global economic system but the raw facts don’t lie. Also, I’ve been poor and destitute unlike the majority of HN posters and understand how desultory and ineffective the majority of social programmes truly are - and this isn’t just an American perspective, as I’ve lived in nearly a dozen countries.)
What’s this have to do with being ruled by rent seekers? They don’t have an absolute plan, but Powell has been projecting his goals for months; stop the rise of wages. Deflate asset prices. That sure sounds like a plan. A vague one, but a plan. Opioid addiction was an intentional plan given the evidence at various trials.
The economy and where possible, the outcomes, are intentional. Cushion “the right people” with free money so they can ride the long hard dip. Gamble with everyone else. It’s all part of the spoken traditions so we readily accept it. The lords of finance demand sacrifice!
Generally I agree that liberal-ish do-goodery interventions are basically hokum, but in the good old days when there was a real socialist project in the form of the Soviet Union, left economics also pulled huge numbers of people out of poverty. The Bolshevik program of "socialism in one country" turned the leftovers of the Tsarist empire, mainly a mass of peasants who had in living memory been serfs, into an industrialized powerhouse with a quite high standard of living.
I'm not a tankie or Stalin apologist but it shows what is possible within a planned economy.
Not sure I agree when we now know the majority of their economic statistics were hokum and the average second world worker earned basically 10%-25% of what a Western European or American worker did. https://cepr.org/voxeu/columns/soviet-economy-1917-1991-its-...
Sure, (most) Russians made less than (most) Americans but they also importantly made more than workers in other parts of the world (the "third world") and even more importantly much much more than Russians were making in 1917. There is also probably a stability dividend at play here -- Soviet jobs were much more stable and so much more desirable even at the same level of pay.
> much much more than Russians were making in 1917
Which is true almost anywhere in the world. Obviously Russia was several decades behind Western Europe in 1917. But I'd be very surprised if the gap between Western Europe and Russia in this regard was considerably higher in 1915 than it was in 1985.
> Soviet jobs were much more stable and so much more desirable even at the same level of pay.
What is this even supposed to mean? It was illegal to not have a job, and most people couldn't freely choose their workplace. Obviously certain positions which provided access to state resources were highly coveted despite only a moderate increase in pay (I don't think I need to explain why). How is that in any way something positive, though?
Right, slave labor can be incredibly efficient in certain circumstances. Plantation owners in the southern states and the Caribbean had already proven that a hundred or more years before Stalin.
It's not particularly surprising that if you literally work a few million people to death and distribute the surplus they created amongst the rest of the population (the one innovation I'll grant the USSR), you can post some impressive growth figures. In fact, the more people who starve to death or die in the gulags, the more per-capita productivity increases.
> who had in living memory been serfs, into an industrialized powerhouse with a quite high standard of living.
Right. You can probably say the same about many states in Germany. Russia was just 40 or so years late. It's not unreasonable to believe that its industrial output would have reached similar levels without the revolution in a comparable timeframe (probably with considerably higher inequality, but with a magnitude or two less murder; however, higher inequality would probably have meant more people dying from preventable diseases, which could potentially offset the million or two who were executed).
Look up "holodomor", "dekulakization", "law of spikelets", and other lovely things. It's possible to get a lot done in a worker's paradise when you can put a bullet in a head.
Is it a bubble or a crash when not accompanied by a huge wave of abrupt layoffs and / or market losses? From where I'm sitting, the current phenomena appear slow moving and unfolding over such a long period of time, it's easily observable and unsurprising.
Maybe we just have a different definition of these terms?
It starts as hiring freezes and funding drying up, it ends in mass layoffs.
In a year or two the crash will be obvious, everyone will be certain investing is a terrible idea, and it’ll be a great time to invest/start a company.
Not all companies need huge investment, and it's a good time to start a company when a lot of companies are dead or dying, and you can step into their market and buy their assets for cheap.
Because by the time you're ready for serious investment the cycle will have turned again. Many great companies are founded in downturns (possibly just because people are laid off, possibly because money stops being a motivating factor).
I've been hearing the same exact thing for 40 years now, about how the economy is going to burst. Sooner or later, someone is going to be right.
I predicted it, too. I saw it coming the exact minute the government started giving out stimulus and unemployment checks when the pandemic started. Shouldn't I be getting all the credit for realizing this was going to happen a few months into the pandemic?
It doesn't take a genius to figure out how this whole thing plays out. Burry is just stating the obvious; there are no great amazing calculations needed to figure it out.
That one is a mixed bag. There are clearly areas where it is falling - SF, Austin, Boise, a few regions like W Florida... but there are regions where it simply is not (the northeast in particular).
I'd love for that to fall. And let's not talk about rents.
> Through his analysis of mortgage lending practices in 2003 and 2004, he correctly predicted that the real estate bubble would collapse as early as 2007. His research on the values of residential real estate convinced him that subprime mortgages, especially those with "teaser" rates, and the bonds based on these mortgages, would begin losing value when the original rates were replaced by much higher rates, often in as little as two years after initiation. This conclusion led him to short the market by persuading Goldman Sachs and other investment firms to sell him credit default swaps against subprime deals he saw as vulnerable.[14][15][16]
That's the challenge with predictions. If you're off by a day, you'd still be perceived as being right. If the predicted magnitude was 5% off you'd still be perceived as being right. There's no hard and fast rule as far as I can tell about how off you can be and still be considered "right".
Burry was right, though? His prediction was that we were in an easy-money-fueled asset bubble, and we are indeed down about 20% off ATH in the total market, more in tech.
The prediction is based on really simple macro too. Nothing is certain except death and timing a bubble pop/price correction is notoriously hard, especially when it’s based on discrete decisions by the Fed to raise interest rates/stop expanding the balance sheet. But it’s been clear for quite some time that the Fed would have to stop that eventually as stopping inflation became more important than stimulating the economy.
I’d much sooner call the people who thought the loose monetary and fiscal policy of the 2010s (and especially during COVID) was a permanent fixture the fools.
We should come to terms with the fact that central banks simply can’t raise us out of inflation in short order. It disrupts too much and will cascade. Not to mention politicians implementing policy that directly hinders their efforts.
Yeah it feels like our economic systems are patches built on patches built on patches, etc. Like introducing a species to an ecosystem to control an invasive species which becomes one itself and requires the introduction of another species.
Most our code bases look like this too. Why wouldn’t our financial systems become hairballs too?
> Michael Burry has been claiming the sky is falling for more than a year now.
It's been a little longer than that:
2010: UH OH: Michael Burry Agrees With John Paulson Again https://www.businessinsider.com/michael-burry-john-paulson-f... "Michael Burry, one of the first to predict the subprime crisis and bet against it, is now betting on a weak recovery by investing in gold and farmland, two hedges against inflation."
> At this point, the entire globe has fallen in uncharted economic territory.
How so? We're back in a more historically normal interest-rate environment. The last 12 years of zero interest rates were the "uncharted" territory IMO.
Not a Nate Silver fan, here, but I don't think the "right once" sentiment about him captures his record accurately. What really happened is that millions of people who have an underdeveloped understanding of what it means for a lower-probability event to happen in a single trial are telling on themselves whenever they refer to him as having been "wrong". I mean, if you play only one round of russian roulette — but you lose — it's not really a surprise, is it. There are only six chambers.
Yeah, Silver simply tells us the odds. Someone did the analysis on it at one point and 538 was "right" at about the rate of their percentages.
i.e. For all the times they called an event 65%, it happened about 65% of the time.
So they weren't "wrong" the other 35% of the time, it's just those are the times the 35% chance bore out.
People have a habit of assuming "is likely" is the same as "guaranteed". Or that that's what the caller believe will happen. And then using the event once passed to discredit the initial analysis.
This is exactly why the 2016 election was interesting. Silver called it about 33% chance Trump would win. And then he did, and people said "AHA! You were WRONG!"
But he gave Trump a higher probability of winning than just about anyone, other than Trump himself. One of Silver's detractors, Sam Wang, even said it was so impossible Trump would win that he'd eat a bug if he got more than 240 electoral votes. (he chose a gourmet cricket https://www.politico.com/story/2016/11/sam-wang-poll-expert-...)
Nate and team do modeling well. Fat tails, correlated populations, and they deal with the crap datasets they get from pollsters. I always tune into his podcast when an election rolls around.
> This is exactly why the 2016 election was interesting. Silver called it about 33% chance Trump would win. And then he did, and people said "AHA! You were WRONG!"
What was funny about that one was he also got a bunch of "LOL no way you lunatic, he can't possibly have that large a chance and no-one else is saying so, so you're dumb, or maybe you're secretly a pro-Trump shill".
In that case, please "educate" me on how to understand the quality of a prediction (was it accurate or not), of a binary event (say a Presidential candidate winning election), when both outcomes are both "probable" to a non-zero degree.
For 538 specifically, you can start by reading their own evaluations, as written in their helpfully titled "How Good Are FiveThirtyEight Forecasts?"[1]. You can then look up keywords describing their methodology to understand the strengths/limitations of those methods.
On a broader scale, what you are asking about is called a "Scoring Rule". Wikipedia, as usual, provides an overview[2]. You can take the mean of a forecaster's score, which allows you to compare forecaster methodologies.
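To make "scoring rule" concrete, here's a toy sketch of the classic one, the mean Brier score: the average squared gap between the stated probability and what actually happened (1 if it did, 0 if not). Lower is better; the forecast numbers below are made up purely for illustration:

```python
def brier(forecasts, outcomes):
    """Mean Brier score over paired probability forecasts and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Toy data: a sharp forecaster vs. one who always hedges at 50%.
outcomes = [1, 0, 1, 1, 0]
sharp    = [0.9, 0.1, 0.8, 0.7, 0.2]
hedger   = [0.5, 0.5, 0.5, 0.5, 0.5]

print(brier(sharp, outcomes))   # ~0.038, much better than...
print(brier(hedger, outcomes))  # 0.25, the always-50% baseline
```

Averaging a score like this over many forecasts is exactly how you compare forecasters, which is what the 538 self-evaluation linked above does at scale.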
I'm not sure why your comment needed to be so aggressive.
A single prediction? You can't. Multiple predictions? Compare the predicted and the actual frequencies, that is, check whether, for example, (roughly) a third of the events that were predicted to happen with 33% certainty to happen actually happened. It is as bad to be not confident enough as it is to be too confident. The standard term for this is calibration[1,2].
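That frequency comparison is simple enough to sketch in a few lines of Python; the data here is a made-up toy example of a well-calibrated forecaster:

```python
from collections import defaultdict

def calibration_table(forecasts, outcomes):
    """Group forecasts into 10%-wide buckets and return, per bucket,
    the fraction of those events that actually happened."""
    buckets = defaultdict(list)
    for p, o in zip(forecasts, outcomes):
        buckets[round(p, 1)].append(o)
    return {b: sum(os) / len(os) for b, os in sorted(buckets.items())}

# Toy data: the 70% calls come true 7 times out of 10,
# the 30% calls come true 3 times out of 10.
forecasts = [0.7] * 10 + [0.3] * 10
outcomes  = [1] * 7 + [0] * 3 + [1] * 3 + [0] * 7

print(calibration_table(forecasts, outcomes))  # {0.3: 0.3, 0.7: 0.7}
```

A perfectly calibrated forecaster has every bucket's empirical frequency matching its label; deviations in either direction (over- or under-confidence) show up directly in the table.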
Note that this is the easy part: a sports model that predicted a 50% win for the first team, a weather model that predicted historical averages, or a language model that predicted letter frequencies, would have near-perfect calibration, but would at the same time be pretty useless. The other part is discrimination: how educated your guesses are. That is not so simple to quantify, although the 538 articles above mention some of the possible measures.
That's it for what an uncertain binary prediction means; but why do we want one? Well, if you're betting (literally or figuratively) on an outcome, it probably makes a difference to you whether the "losing" possibility will come up 1% of the time or 40% of the time; but that does not seem that easy to formalize and may feel unsatisfactory.
In that case, here's a formal result.
An always-certain prediction service is obviously equivalent to a deterministic decision rule, which churns some data about the situation and says yes or no based on that. (They are the same thing.) An uncertain prediction service is (less obviously) equivalent to a randomized decision rule, which churns some data about the situation, tosses some (known) coins, and says yes or no based on both. (Take the service's result, output yes or no with the probabilities it gave.) Of course there's always a probabilistic decision rule that performs at least as well as any given deterministic one (run the deterministic rule, choose not to toss any coins, output its result).
It turns out (see e.g. the introduction[3] to Chentsov's monograph[4]) there can be randomized decision rules that are strictly better on average than any possible deterministic rule.
Thank you. I thought I was going slightly insane having someone argue otherwise.
I understand you could evaluate quality over many deterministic predictions, but I'd also presume that those have to all be similar in nature (i.e. all be election outcomes) otherwise there are too many confounding factors if you try and evaluate quality over vastly different prediction calculations.
I don't really understand why people are so obsessed with Nate Silver (particularly negatively). I listen to the 538 podcast and I'm interested in what he has to say about polling (and the discussions are always interesting, even if I don't always agree with them), but that's about it. I also listen to dozens of other podcasts where they talk about other things they know a lot about. It's also not like Nate is the word of god on polling. He's just another source of information. You can take or leave it. It doesn't matter. But why complain about him?
Burry was actually just right again, and in a major way.
He predicted the market would crash this year, and liquidated most of his portfolio in the first half of the year to get out of harm's way. Most of the rest of us are now nursing massive losses.
If you are in finance/hedge funds, you would need to be blind not to know that, based on all the indicators around the world in the first quarter of 2022 - inflation, war, energy prices, lockdowns in China, data about trade and maritime transport, etc.
I also liquidated most of my stock positions based on that data and I'm not in finance.
I don't know about Burry, but Nate Silver has been right more than just once! https://projects.fivethirtyeight.com/checking-our-work/ is a statistical analysis of 538's track record in forecasting elections going back to 2008. The analysis shows that 538's predictions are generally pretty well-calibrated, in the sense that e.g. if there were 100 races where 538 predicted "Democrats have a 70% chance of winning this race", then the Democrats did actually win about 70/100 of those races.
Grantham is a notorious perma-bear, and given that article was posted right before a 20% rise in the market, I'm inclined to say that's another wrong call for him.
There was a movie etc., and people know his name and how he knew about the crash when nobody else did. So it attracts clicks, because people want to know this time. Bloggers/journalists are interested in clicks.
People know the names (or the movie name in this case), and clicking on a familiar name is probably more likely than clicking on a name no one has ever heard of.
That was the 1980s and it was PBS, one of the biggest of the few major channels on American airwaves. It wouldn't work today because economists cannot boil down complex ideas into 15-second videos.
Probably for all the wrong reasons. He was pretty widely mocked at LSE when I attended there a decade ago, and that's an institution I'd expect him to still be well regarded at. He traded away his academic bona fides years ago.
I've been reading headlines like this for over a decade now. I wonder when it will finally happen.
Also interesting use of the word bubble considering it involves an unprecedented change in how our economy has functioned for the past 3/4 of a century.
"The car dependent American suburb bubble is bursting"
I dunno, didn't he also say there was an imminent water crisis and that index funds were a bubble? Who knows, dude very well may be right in the long term but I don't think they panned out as investments.
Ultimately, he may very well be right but I don't put much more weight into his opinion than any other prognosticator. I feel like the subprime bubble was uniquely suited to his expertise, but stuff like this requires both a deeper understanding of jobs and technology than he has, and is less of an inevitability.
White-collar jobs will boom, but they'll be outsourced. Work from home changes nothing about labor needs, only labor location. If anything, work-from-home is a net productivity negative for large, bureaucratic organizations, because slacking becomes even easier.
Burry has a history of incorrectly predicting crashes [0]. You predict one crash correctly, and all you see are crashes. Then again… maybe we're in a very slow moving "crash" right now, and he predicted it 4 years ago?
> just let me know when I can easily find good devs
As a good dev looking for a job, I can tell you with 95% certainty that the good devs are not applying to your job because your company is a boring knockoff doing boring things and your job sounds like a boring dead-end where developers will be cordoned off into a little box with nary enough room to turn around while they implement mandates handed down from above by a product owner who never uses the actual product with no opportunity for actually discussing whether this new silly feature will make the product worse or not (it will) while technical debt and cruft accumulates and any attempt to invest in the long-term health of your product will result in a reprimand.
After months of looking at job postings, hiring managers complaining about not finding good devs just sounds like incels complaining about not being able to date hot women. Look inward. Lots of good devs are out there looking. They're just not looking at you.
With an attitude like that, it's no surprise you are still looking. Let's be honest: the good devs, with good communication and people skills, already have jobs, and their companies will do whatever they can to keep them.
> they already have jobs and their companies will do whatever they can to keep them.
I truly wish I lived in the world you imagine. I've been in the job I'm leaving for 13 years, received top-tier performance reviews every year, constant improvement, very good software, technical lead of one of the most profitable teams in the entire (huge) company. Everyone who has ever worked with me directly has great things to say. Meanwhile the vice presidents of such-and-such that play musical chairs with their titles every 2 years do not care. I have made it very clear what they need to do to keep me (and it's not onerous, in fact it's mutually beneficial for everyone and completely in line with the CEO's vision), and they do not care. I'm not sitting in the right seat; I'm not already buddies with the right VP; so fuck me. I can go get fucked.
It honestly sounds like you either have no real-world experience or you've gotten extremely lucky with the company you found. If it's the latter, do whatever you can to stay there as long as it doesn't turn to shit (and it can, at any point).
To be fair, I wouldn't consider someone who has been in the same company for 13 years an expert on the quality and quantity of devs on the market. Your experience within your one company is not a good indicator of the rest of the market. Most people I know jump jobs every 2-5 years going on to more and more interesting opportunities. Even at these more interesting places, the message is the same: we cannot find enough people.
Job hoppers with 2-year stints can't have any deep domain knowledge. Nor can they judge even their own code's quality/robustness/usefulness - that's way too short a period of time.
You're way off. There's a huge diminishing return on how much you can learn at a single job. From my own experience, I'd say it peaks around 6-12 months.
There's some longer-term things you can only learn by being at the same company for a long time (such as "how to grow a startup from 1 to 100 employees" and what comes with that), but those things are usually outside the realm of what software engineers interest themselves with.
If you're spending more than 2 years at the same company, the only domain knowledge you're growing is that of how to be an employee at that particular company, and very little of it is very useful outside of it. Things like "I know how all these Facebook internal tools work inside-and-out". Cool, we don't have or need them cause we're a totally different company, now what?
Whereas a lot of even surface-level knowledge can bring a huge amount of unique perspectives and solutions. And my own most successful projects have been the results of combinations of experience in multiple domains at once.
And none of this prevents you from actually developing deep domain knowledge. For example, I have NOT held the same job for a decade and a half, but I HAVE been using Python (and other various software) for all that time, in a large amount of very varied situations, and I absolutely have "deep domain knowledge" there.
Define "domain". You can have tons of experience in a single domain across many jobs. In fact, I would wager that working in the same domain with differing groups of people, toward differing requirements, gives you a richer understanding of some domains compared to spending 10 years at a single employer.
Yeah, they aren't going to be the best with a specific company's stack, but they'll likely have a better perspective on how it can be improved.
This idea of gathering deep domain knowledge isn't very realistic in my opinion. Not in slow moving, poorly managed companies at least, when compared to being a solo dev or working at a fast paced, well-organized startup.
I built a full stack app in about 6 months, at one point, 5 years ago. (frontend js framework, backend js framework, SQL framework)
I now work for a company whose applications aren't much more complex than what I built, in terms of systems and complexity. Though the apps are definitely much more polished and with many more complete UX & usage options. It makes millions of dollars.
At least, that's the case at most large companies, which are lumbering and slow-moving, where 60% of people contribute and 40% barely hang on by pretending to work
(largely because A. they don't know what's going on due to poor documentation and poor requirements gathering, and B. they aren't qualified in the skills needed for the work currently under development).
As a solo developer, I move much faster and learn much more on my own, than I do at my current job. Why? Because I direct my own work, on my own projects. I don't have someone who has never built an application, who has a project/product manager title, trying to gather requirements for something they've never done before.
Not to mention that they've never founded an organization and led it to success.
So, they don't understand how Apps are built, and they don't understand how Organizations are built and guided efficiently...
Combine this with a company that has a bunch of legacy applications and is now moving into somewhat over-hyped frameworks...
Yeah... Don't get me wrong-- I like the people I work with. I see a lot of under-qualified people in management who are slowing down the system because they can't efficiently organize people to do what they don't understand-- they have too many unknown unknowns in their ability to parse out upcoming work. Unfortunately, most of them were hired for currently-fashionable political reasons.
> I now work for a company whose applications aren't much more complex than what I built, in terms of systems and complexity. Though the apps are definitely much more polished and with many more complete UX & usage options. It makes millions of dollars.
This is exactly what I’ve found. Most of my work is technically easier than my personal projects.
There are definitely parts that are far more complex, but the vast majority of the company isn’t working in those areas.
From what I’ve seen, if you want domain knowledge you have to dig for it.
“job hoppers”. the era of working at the same factory for 30 years is long gone. job huggers have been out of fashion for a while. in modern tech you need to freshen up every few years or else you’re history.
I like how you and everyone below this comment seem to know exactly what my job involves. I've worked on somewhere around 200 different systems and been a/the lead developer on several dozen. I've worked in ~20 different engineering and science domains and gotten enough knowledge that engineers in the individual fields ask me questions about their domains and ask me to QC their Excel spreadsheets and calculations, having nothing to do with software development. I've worked with 6 different database technologies from FoxPro to SQL Server to Redis. I've written code professionally in 8 or 9 languages. I've written programs to calculate statistical outliers, perform mass-balance calculations, do spell/grammar checking, unit conversions, translation, some light AI, and lots more. I've written proposals and made marketing materials. I've worked with hundreds of different people on projects of all sizes.
So everyone assuming that I stopped learning anything new 12 years ago can fuck right off.
I'm not saying this to brag, I'm saying this to point out that all these assertions that "you should change jobs every 2 years or you stagnate" or "you can't learn your job deeply in 2 years" are both completely unfounded bullshit assumptions. It completely depends on what the job is.
I think you're still missing my original point. You have been in your own bubble for a decade. It may have been a very interesting bubble with a lot to learn but it's still a bubble. You don't have a sense of what other companies are looking for because you haven't asked them or put yourself on their radar. You don't have a sense of the level of skill of developers outside of your bubble. I'm not claiming to be an expert here either but you specifically present yourself as being knowledgeable on the subject because of your 13 years with one company, which just doesn't make sense
This is not true, and taking a jab at a fellow engineer looking for a job is not cool or funny. With that out of the way: most boring companies (insurance, e-shops, healthcare, logistics, etc.) don't care about keeping anyone (tech) around, so "if you are good enough, companies will try to keep you" is a lie, at least in the US. Anyone can pick up their tech stack fairly quickly. The technique I have seen some "boring" companies use is to hire very junior folks, convince them they are making progress in their careers, and promote them to SDE1, 2, Senior SDE, Team Lead, etc. (raising their compensation by 3-5% each time, and making it a big deal), which means very little if your company doesn't have real infra. Many people settle for a familiar environment (sometimes the only one they know) that pays them just enough. In big tech it's a bit different (in my experience from Amazon, Msft and Google): they do have "Dive & Save" money to keep you from going to the competition.
His was a valid retort to an annoying gripe from highly entitled managers. I've worked with plenty of top level devs. In SVish companies with boatloads of money this may be true for the top 1% of devs (some startups seem to have this culture), but that's just a piece of the tech world.
The reality is that for a massive number of devs working for traditional businesses, there isn't all that much on the table. You can try to move up into the management caste and get big bumps, but you're only going to get incremental advancements.
I actually wonder if any companies have considered massive salaries to be a liability in litigation, because it could be pointed to as an example of unfair wages.
Anyway, you're getting a fancy management job, or some super elite dev job.
As a counterpoint, some of the smartest, hardest-working people that I know have extremely low self-esteem and self-confidence issues. They do excellent work under horrible conditions and are not properly compensated for it.
Similarly, I have worked at places that had "good devs" without such image problems, at companies that did not understand (or appreciate) what it was that they had, and did nothing to keep them on staff. "Whatever. It's hiring season anyway." After their departure, productivity suffered, and when points were reiterated (points that had been covered before the person left), the response was something along the lines of a timid, "Oh...I thought you were joking."
Your assumption is constructed around the idea that people will always do what is in their best interest, and be in a frame of mind to appreciate the "big picture" things. In my own experience, this is rarely the case. More often than not, it is a sort of miracle that even some very well-known places are able to stay in business.
I'm sure there are exceptions, of course. Some upper management really are very smart and "keyed in" to the value of specific employees output and are also in a position to do something about it. I would simply argue that those arrangements are significantly less common than you might be led to believe, and that such remedial, reactive measures are unsuccessful far more often than they are successful.
What could I say to convince people that this is not the case? Like I have a small team trying to revolutionize nuclear power deployment performance with a highly customized PLM app but on paper it just seems to sound boring.
Obviously I don't know your specific case, but I can offer a few ideas:
> trying to revolutionize
There are a lot of aspirational companies out there. I'd rather work at an aspirational company than one with no vision at all, but you still need to convince me that there's an actual chance you can deliver on this vision. If you told me you were going to revolutionize the world's energy sector with a perpetual motion machine, I'm obviously passing. Treat me like an investor -- convince me your company has a future. Because, of course, I am an investor; I'm considering investing a large percentage of my professional life in your company.
> nuclear power deployment performance
Nuclear power is interesting but highly regulated. If I come up with new and interesting ideas that contribute to this space, is it too tightly constrained by regulation to have any chance those ideas could get applied? Will I be in a position where anyone will actually listen to my ideas, or have you already made up your mind about how everything is going to work? Am I going to get to talk about the product development, or is this going to be handed down to me with people taking offense if I try to explain why that latest change doesn't make sense with the rest of the application?
And what is "deployment performance" -- trying to cut the costs involved in building and deploying new nuclear power plants? If it is, just say that. Don't make me guess what your company actually does, because I'm going to guess using my priors, and as you probably guessed from my other comment, my priors are very negative.
> a highly customized PLM app
Two red flags here: "highly customized" -- I'm a software developer. I write code (among many other things, of course). What does "customized" mean? Are you building this in some other inner platform? "Customized" sounds much weaker than just plain new code. Will I find Apex code built on SharePoint or something awful like that? "Customized" is a huge red-flag word for me because it's almost exclusively used by people who are afraid of programming and don't understand the software development lifecycle.
The other red flag: "app". You're trying to "revolutionize nuclear power deployment performance" with an ... app? Look, I get that "app" is just short for "application" and therefore could mean anything, including an advanced industry-centric analysis tool. But it doesn't, does it? You wouldn't use the word "app" for that. So you're trying to revolutionize nuclear power with a $2 app in the Apple Store or something? That's what it sounds like to me.
Again, think of me as an investor. If that's your elevator pitch, I've come away with the message "big ideas with absolutely no idea how to deliver on them".
Supply and demand.
The devs are out there, most likely your salary offer is too low.
What makes a dev good in your mind? How many hours a week do you expect the devs to work? Work from home? Is your salary enough to lure them from FAANG or vc backed companies?
There is always a reason the devs are not coming, and most of the time it has something to do with work/life balance, benefits, or salary.
I don't know you; you could be the coolest boss on earth, but if something is wrong with the above points, then you won't find what you are looking for.
Define good. What specific skills do you want? How much experience in each of those skills?
I have noticed that too many times employers want literal gods to come in and fix their entire tech stack, their product, and by extension their company.
Of course, they are frequently only paying average salaries.
I think it's fairly clear even in his example that not all white collar is created equal. I'm not sure how you'd divide it but certain professions like Doctors and developers are not in the same camp as management/accountants/lawyers. So, for the following quote from the article, who are the people actually necessary for "creating and delivering the goods and services that consumers want and need on a daily basis."
"there is a massive shortage of blue collar jobs like truck drivers, hospitality and restaurant employees, and other employees necessary for actually creating and delivering the goods and services that consumers want and need on a daily basis."
It's been funny seeing the anti-WFH sentiment especially in light of Ed Zitron's (https://ez.substack.com/) counter-arguments. At this point, I'm somewhat convinced that anyone in business railing against WFH either misses crushing worker ants under their thumbs or has a huge investment in commercial REITs.
I know for a fact that I was much less productive working from home. Perhaps there are others in a similar boat who've extrapolated their experience out to the general public.
Managers will always find a reason to add headcount to their departments and orgs. Yeah, we will probably see some layoffs and some restraint, but eventually it gets sorted out, a new bull market takes hold, and we are off to the races again. Impossible to see right now, and it could be months or years. But it will all come back and we'll make similar mistakes again.
I'm not disagreeing with your premise about Boomers. The youngest Baby Boomers (born 1946-1964) are in their late 50s, although the oldest have been retirement age for 10 years. It's likely that many had postponed retirement, but health and economic issues have forced them into retirement.
On a tangential note, everybody always forgets Gen X. Or, maybe 'Boomer' has become a term for 'someone older than a young adult'?
> health and economic issues have forced them into retirement
For many too, they held on because their job was fun. My neighbor was a shipping container captain… he was eligible for retirement for many years. What finally made him retire was all the rigamarole the pandemic introduced. It stopped being fun.
The same thing happened with a local family-owned Agricultural store near me that’s been around for 70-odd years. Dealing with everything in the past 2 years sucked all the joy out of running the store, so they decided to retire.
Every single time I hear millennial I think avocado toast. And every time I hear boomer I think of strong perfume.
The word's meaning sure hasn't changed, but I imagine the longer it's mainstream, the more it will morph. Soon we won't be allowed to say it at all, lest we offend somebody with the slur.
It's a generational thing. Just like Gen-Z hates the boomers for exploiting real estate the children of Gen-Z will complain their UBI pays too little because millennials exploited massive tech salaries to automate away burger flipping joints and software development
this seems like the likely cause (compounded by the pandemic - people shifting industries, online ordering skyrocketing); unfortunately we still have another 10 years of baby boomers.
Curious to what “Gen Z are much smaller than Boomers” is in reference to. Population size or literally physically? (Which neither is true from what I can tell?)
This dude and this article are really not making a great case here.
The last sentence of the tweet is a dead giveaway of the archaic thinking. Why is WFH going to be blamed again? It feels like it's yet another manager's plea for employees to come back to the office and get micromanaged.
The article tries to answer my question:
> Work from home. The work from home trend is threatening office jobs because it reduces the need for many traditional office roles like secretaries and receptionists.
In what world are secretaries and receptionists a prevalent job title in the first place? Is the author of this article retired or something? The first wave of computing already shifted those roles. Executive assistants to the SLT are not just answering phones and twiddling thumbs. On top of that, in the modern world, receptionists are one of the most minimally staffed parts of a company already. My last 1,000-employee in-person company had no more than two or three receptionists, and they were contracted out to a vendor.
You know what is expanding post-pandemic? Co-working spaces. [1] Technology and the pandemic aren't eliminating offices, they're changing them. Many employees want a separate space to work, or a place to collaborate in person, but they don't want to be forced to go in daily. Companies don't want to have to make long-term lease commitments and dedicate one seat per employee.
Guess who has receptionists? Coworking spaces. But again, they aren't sitting there twiddling their thumbs. WeWork and the rest of the modern coworking spaces are highly automated. You can't just ask the reception desk to fix something or make a change to your office; you have to put in a ticket. These receptionists wear many hats: they're salespeople, they set up events, they communicate with tenants, they do light cleaning. Their tasks are too varied to be replaced by automation, and their salaries aren't that expensive.
> In contrast, the increasingly widespread adoption of enterprise software, data analytics, and machine learnings are increasingly rendering office jobs moot as much of the traditional number crunching, document processing, and repetitive analysis performed by office professionals is now being performed by software.
Automation is driving new business ventures for precisely the reasons the article brings up. Thousands of businesses exist today that couldn't possibly exist because computing wasn't powerful enough. As soon as you automate an enterprise task it opens up more opportunity to provide more complex business products and invite new ventures into the mix making products that didn't exist in the past.
Being able to do more things with less money is basically always a net positive.
> Furthermore, while restaurant, retail, and even factory workers are all expected to be ultimately replaced by robots that tech companies like Tesla (TSLA) are producing, this technology is even further away from being widely deployed, with some predicting these machines to not be meaningfully impactful until the end of this decade.
Not 100% related to my argument, but this quote above is one of the dumbest things this article claims. Anyone who thinks that retail and hospitality workers are going to straight up be replaced by robots is delusional, especially with the timeframe of "the end of this decade."
It costs something like $20/hour to hire a line cook, less in low cost of living areas. Certainly, the future of hospitality has fewer workers per customer, but that's just normal labor optimization. There won't be a fully automated McDonald's, but McDonald's employees work with a lot of automated devices. The dexterity and intelligence of a human is very difficult to replace entirely. Robots aren't going to be talking to customers at the Apple Store, the whole reason people are there is to talk to a person, and retail salaries are already quite low.
And why is Tesla in this discussion? Tesla doesn't produce factory automation robots. They purchase plain jane factory robotics that have been part of the auto industry for decades. It's like the article just wanted to bring up Tesla for no reason.
> As a third strike against the safety of white collar jobs at the moment, most companies have already more than fully recovered their number of white collar jobs from before the COVID-19 outbreak.
The article from this point on is basically talking about the normal boom/bust cycle of business, even though the beginning of the article and Burry's tweet are talking about permanent decline.
I'd say that this part of the article unintentionally makes the opposite argument it's trying to make. It is directly stating that blue collar job demand is worse, and white collar jobs recovered almost immediately.
What is the actual structural reason why companies won't hire more white collar workers? You can't just spew out "automation" because the past 20 years or so has shown us the opposite trend in relation to automation. White collar workers are the foundation of automation companies.
But it seems to be very far behind its competition.
Also, for evidence on what I was talking about with McDonald’s, just jump on YouTube and search “McDonald’s POV,” good luck automating all that and beating the human and simple specialized tools on performance.