Episodes

  • Originally published via Future Commerce

    Have you heard of the infamous “Betty Crocker Egg” story?

    It goes:

During the 1950s, sales of instant cake mixes were struggling. A worried General Mills, owner of the Betty Crocker brand, brought in consumer psychologist Ernest Dichter (creator of the focus group) to conduct interviews with housewives.

In his discussions, he learned that the effortlessness of the instant cake mix left housewives feeling guilty: using the product was “too simple.” The process (or lack thereof) felt like self-indulgent “cheating” compared to the more rewarding work of baking from scratch. Therefore, the mix was a problematic buy.

    An insight and opportunity: “What if we left out the powdered eggs from the mix and allowed people to add fresh ones themselves, increasing participation, decreasing guilt, and ultimately increasing sales?”

    It worked. Once the new cake mix requiring fresh eggs was released, sales of the product began to soar — a win for both the baker and brand.

    This story reveals the seemingly irrational consumer mind and is a case study of the importance of in-person qualitative research. Only by looking beyond market data could we learn about “premium friction” or that the opposite of a good idea (e.g., more work, not less) may also be a good idea. For this reason, marketers, strategists and innovators alike love sharing it.

The Betty Crocker tale supports “The IKEA Effect,” a cognitive bias coined by behavioral economist and author Dan Ariely (with co-authors Michael Norton and Daniel Mochon). As proposed in their study, by putting together our furniture (rather than buying it pre-assembled), we create a unique, more personal relationship with it, increasing the perceived value of our creation. Like requiring fresh eggs, our participation changes perceived value.

    But here’s the problem:

    The “Egg Story” as we know it is bullshit.

    Critical Omissions and Confirmation Bias

    Why is it bullshit? It’s missing critical nuance.

There are five critical details that re-tellers leave out:

First, Dichter’s findings also included, though no one acknowledges it, that fresh eggs produce superior cakes.

    Author and historian Laura Shapiro confirms this overlooked truth in Something from the Oven: Reinventing Dinner in 1950s America:

    "Chances are, if adding eggs persuaded some women to overcome their aversion to cake mixes, it was at least partly because fresh eggs made better cakes."

    The original dry egg mix produced cakes that stuck to the pan, burnt quickly, had a shorter shelf life, and tasted like eggs. We knew fresh eggs made for better cakes because...

    Second, a patent for fresh eggs in cake mixes was first filed in 1933, decades before Dichter discovered their “psychological importance.” The original patent reads:

    “The housewife and the purchasing public in general seem to prefer fresh eggs...”

Companies had been debating dry vs. fresh eggs since the very inception of the cake mix product, not just when “sales were struggling.” (More on that in a second.)

Paul Gerot, CEO of Pillsbury at the time, called the egg mix “the hottest controversy we had over the product” from the get-go. The story makes it seem like fresh eggs were a novel discovery. In reality, these companies had been debating them for years.

Third, around this time, cake mixes were actually selling incredibly well; worry only set in once they were no longer flying off the shelves.

    Between 1947 and 1953, sales of cake mixes doubled. The concern only arose during the late ‘50s, when there wasn’t a “decline,” but just a modest +5% growth — a “flattening,” if anything.

Cake mix sales didn’t suddenly flatten because of a mass onset of guilt... especially after years of excitement and growth. There are endless explanations for a flattening, such as novelty wearing off, market saturation, product competition or evolving tastes.

    The story makes this seem like a brand problem when in fact it was a shared category problem. Which brings us to...

    Fourth, when sales stalled for the category, General Mills and Pillsbury adopted two different strategies:

    General Mills required the fresh egg, while Pillsbury offered the complete dry egg mix.

    If the fresh egg were such a business-saving idea, we should have seen Betty Crocker wipe Pillsbury out of business.

    We didn’t. Pillsbury still thrives today.

And fifth, while Dichter was onto something with baker participation, the egg shouldn’t be the star of this story; the icing should be.

According to historian Laura Shapiro, it wasn’t so much the fresh eggs that brought cake mixes back from their slowed growth as inspirational advertisements empowering homemakers to decorate their cakes with extravagant and personal flair. The introduction of frosting and elaborate decorations turned excitement away from the cake’s inside and taste to its exterior and splendor.

    This is how General Mills and Pillsbury brought cake mix sales back to life — through the broader cultural turn towards the mimetics of homemaker aesthetics.

    As Michael Park writes for Bon Appétit on the history of cake mixes:

    “It didn't hurt that slathering a cake-mix cake with sugary, buttery frosting helped mask the off-putting chemical undertones that still haunted every box. It worked. By the time the over-the-top cake-decorating fad was over, cake mixes had invaded the average American kitchen, and have been there ever since.”

    There we have it: the full story of Dichter and Betty Crocker’s egg.

    Alternative Truths Form Their Own Realities

    But with these untold truths now laid out, new lessons emerge.

    First and foremost, nuance isn’t fun and doesn’t make for biting hot-take lessons on social media. Details are inconveniences and potential contradictions when pithiness spreads. It reminds us to question our viral headlines: “What’s missing?” Something always is.

When an alternative “truth” like the almighty fresh egg eclipses the real truth (the complete story of egg patents, real sales figures, and icing), new realities form. Stories don’t have to be true to be effective – they just have to sound right, or confirm our existing biases.

    The world we perceive is manufactured from the stories we hear.

    Any narrative which prevails becomes “the truth,” whether it’s complete or factually correct. Perception is reality and reality is only the stories available.

    But aren’t all of our stories made up? Isn’t everything just a social construct?

In 2021, Dan Ariely (author of that “IKEA Effect” paper) was accused of manipulating data in a 2012 study after other researchers could not replicate its results and found the raw data suspiciously inconsistent with the published findings.

    The paper was later retracted.

    The he-said-they-said drama thickens. While Ariely claims he didn’t touch the raw data provided to him, in a statement to NPR, The Hartford (the originator of the consumer data) insists that someone altered the data after they gave it to him.

The data differs, yet supposedly neither party altered it.

Meanwhile, Harvard Business School professor Francesca Gino, a collaborator of Ariely’s, is currently on leave after being accused of falsifying her data... in the very same 2012 study as Ariely.

    The kicker: this paper is about “nudges” to prevent people from lying.

Over the last decade, behavioral economics has become a thrilling topic for psychologists, marketers and anyone interested in the mind. Ariely puts it best in his book title — we’re Predictably Irrational. One study by Gino claims silently counting to 10 before deciding what to eat can increase the likelihood of choosing healthier food.

It’s now common to believe that small “nudges” like requiring a fresh egg can influence our psyches at scale, and further, that our behavioral idiosyncrasies can be distilled down and explained by simple cognitive biases.

    Or maybe not.

What if we don’t know as much as we think we do about what’s happening inside our brains? Maybe we can’t actually explain stalling cake mix sales. And maybe signing the top of a piece of paper doesn’t actually prevent lying, as Ariely and Gino’s paper suggests.

    Maybe we just don’t know.

    And that’s okay.

Our current moment with UFOs, or now UAPs (“unidentified anomalous phenomena”), is a great exercise.

When a former Navy pilot testifies in front of Congress that the Pentagon is hiding evidence of alien spacecraft and knowledge of “non-human remains,” we’re left with an opportunity.

    The real truth here is that we’ll never ever get the full truth. We’re invited to admit, “Maybe we’ll just never know.”

Wonder, awe, mystery and unknowing are beautiful traits that our AI competition will never experience. Bask in naivety. One of our fatal flaws is our adamancy that we know how everything works. How might we be happier or more productive if we were mindful of our hubris?

On a warpath for not just truth, but an easy truth, we overlook other valuable lessons: the world cannot always make sense, personalized participation (icing) is more effective than conventional participation (eggs), and elusive focus groups may not reveal as much as extensive desk research can.

    Yes, there’s a purpose for the half-truthed Betty Crocker tale, but there’s much to be learned in full-truthed tales. We should be open to the full truth as much as we’re open to admitting we just don’t know.

    And if there’s a pithy story to come from the Betty Crocker tale, it’s not about the power of participation. It’s that maybe your product is just crap.



  • To forecast our future, we have to identify patterns of change early. But rather than only seeking out collections of signals representing growth, it behooves us to simultaneously study what’s crumbling – signals of decay.

    After all, growth stems from deep fractures.

    One of today’s most glaring fractures worthy of our attention is higher education. The changing landscape of higher education is ground zero for radical social change and required innovation.

    Good news!

    TVs, toys and software have never been cheaper in human history.

    Bad news: College tuition and textbooks have never been more expensive.

    This is according to the Bureau of Labor Statistics which has been tracking the prices of consumer goods and services relative to inflation for the last two decades.

College tuition — second only to healthcare — is the most “increasingly expensive” buy in America.

How coincidental that these are two of the most important purchases one can make, and certainly the sort more people should have access to, not fewer.

    According to the National Center for Education Statistics, in the 1968 academic year, it cost $1,545 to attend a public, four-year institution (including tuition, fees, room and board).

    In 2020, it was $29,033.

    For the fifth of college students attending private schools, that figure is significantly higher.

Noteworthy, as the cost of “manufacturing” education and textbooks has not risen at the same rate.

    Is it any more expensive to “produce” education today?

This is perhaps why NYU, among many schools across the country, is developing “Schools for Professional Studies” — certificate-program alternatives dedicated to furthering education during a moment when traditional degrees are slipping.

According to a report from the National Student Clearinghouse Research Center, the number of students who earned undergraduate degrees fell by 1.6% in 2022, reversing nearly a decade of steady growth.

As of last year, only 51% of Gen Z are interested in pursuing a four-year degree, down from 71% a couple of years earlier. The pandemic and Zoom screens have put things into focus. And students’ parents are on the same page: nearly half don’t want their kids to go straight to a four-year college.

    Graduate degrees are falling out of favor just as dramatically.

    For The Wall Street Journal, Lindsay Ellis reports,

    “At Harvard, widely regarded as the nation’s top business school, M.B.A. applications fell by more than 15% [in 2022]. The Wharton School of the University of Pennsylvania recorded more than a 13% drop. At other elite U.S. programs — including Yale University’s School of Management, as well as the business schools at the University of Chicago and New York University — applications dropped by 10% or more for the class of 2024. Cost was the biggest factor blunting demand.”

    Meanwhile, this decline is about to worsen — not just because of prices and attitudes, but because of significant demographic change.

    Kevin Carey, VP for Education Policy at New America, a think-tank, wrote for Vox:

    “[In 2026] the number of students graduating from high schools across the country will begin a sudden and precipitous decline, due to a rolling demographic aftershock of the Great Recession. Traumatized by uncertainty and unemployment, people decided to stop having kids during that period.

    But even as we climbed out of the recession, the birth rate kept dropping, and we are now starting to see the consequences on campuses everywhere. Classes will shrink, year after year, for most of the next two decades. People in the higher education industry call it ‘the enrollment cliff.’”

    Like any business facing disruption, many are pivoting to diversify revenue.

Earlier this year I learned NYU was growing its Marketing certificate program for those seeking to enter the field or gain more experience from practicing experts. I raised my hand and began the process of volunteering as an Adjunct Professor at night.

    I reviewed syllabi, audited classes, and had planning talks which spanned months. The paperwork finally began and I was set to lead seminar discussions for NYU’s Fundamentals of Advertising course.

That was until the provost learned that I, myself, didn’t hold a Master’s degree.

    I was axed.

    Despite my desperate pleas, they “weren’t interested in having any further discussions with me.”

The very institution struggling to keep up with the evolving education landscape by providing degree alternatives couldn’t fathom that anyone without a postgrad degree could be qualified to provide its students value.

Or they did, until they learned I wasn’t of their ilk.

    It was flawed logic academics would have loved to call out.

As a practitioner with a decade of marketing strategy experience, and a current guest speaker at schools including Yale, Parsons, Queens College, Franklin & Marshall, the University of Oregon and (oh, right) NYU itself that very month, my lack of a formal degree called for immediate disqualification.

    Meanwhile, NYU’s certificate program was left with a gentleman deconstructing TV ads from the 90s to a dozen black Zoom screens. (Mind you: TV is the fastest declining advertising medium in this field's history, and this instructor’s “acceptable” higher education was a law degree.)

    Am I still salty? Isn’t it clear?

    It was utterly perplexing and hard not to take personally.

    But then: clarity.

In a moment when higher education must be reworked and reimagined, perhaps institutions themselves are not the best-qualified providers of the alternatives we require.

    There’s inherent conflict and rigidity preventing these educational gatekeepers from offering a fair and valuable alternative.

    As much as they want to bet on red — alternatives — they must simultaneously bet on black — protecting their historic brands and upholding the value of the traditional degree.

    This may be an incompatible strategy.

    This is not about me and my opportunity to teach, but about students’ return on investment, experiences and preparedness.

    A genuine commitment to education would mean an integration of more diverse perspectives, and valuing practitioners’ experiences over their (lack of) degrees. Failure to prioritize students’ outcomes will only accelerate the exact reason why NYU has to develop an alternative in the first place.

    NYU isn’t incentivized by its Continued Learning cohorts. It’s incentivized by its astronomically priced degrees and brand.

    I was deemed a valuable asset for students until the institution learned I was at the same “academic achievement tier” as their students. My overwhelming passion, practical expertise and ultimately the students’ education were all overlooked, deprioritized.

As Clay Shirky (coincidentally, Vice Provost at NYU) put it in Bryan Alexander’s book Academia Next: The Futures of Higher Education,

    “The biggest threat those of us working in colleges and universities face isn’t video lectures or online tests. It’s the fact that we live in institutions perfectly adapted to an environment that no longer exists.”

    Alexander, a futurist and senior scholar at Georgetown University, wrote,

    “Much of American higher education now faces a stark choice: commit to experimental adaptation and institutional transformation, often at serious human and financial costs, or face a painful decline into an unwelcoming century.”

    And lastly, as Anthropologist Grant McCracken puts it,

    “The university that cannot fix itself is disqualified from educating our young.”

    The Value of Edu

    To envision solutions for higher education and continued learning, we have to understand the current landscape and how we got here.

    Understanding begets autonomy and action.

    Institutional higher education is headed straight off a cliff.

    If we agree not everyone has to or should get a four-year-degree from a university, why is this institutional implosion problematic?

    Environments for emotional and intellectual growth are critical for both culture and society. For individual and collective growth, we should be fighting to ensure increased opportunities for people to explore new subjects, broaden worldviews, and develop critical thinking skills. This ultimately leads to increased curiosity, creativity and innovation — attributes which develop more informed and engaged citizens in our democracy.

    This isn’t happening.

The inverse is — fewer young adults are obtaining such experiences.

    The recent ruling on affirmative action will further widen the access gap to educational opportunity. How will colleges, employers, and organizations maintain their commitment to diversity and inclusion? Considerations might include new selection criteria for systems of admission, rigorous outreach programs for increasing the number of minority applicants, and stronger partnerships between high schools and postsecondary institutions with an emphasis on matriculating students who face adversity.

    This decision can’t be the final word, according to President Joe Biden. He’s right. Without affirmative action, universities will be limited in their ability to consider race, ethnicity, nationality and socioeconomic status in the admission process. But that doesn’t prevent them from iterating upon decades worth of progress in diversifying America’s campuses.

    We can no longer rely upon institutions to be the sole providers of continued education in our futures.

    Financially, it no longer makes sense.

Approximately 44M Americans hold student loan debt, amounting to more than $1.6T by the end of last year — roughly the GDP of the state of New York, or of the entire country of Spain.

    While Federal Reserve data reveals adults under 30 are more likely to have student loan debt compared to older adults, nearly a quarter of outstanding student loan debt is owed by Americans over 50. This multi-generational burden restricts social mobility and cultural participation. Debt prohibits.

    Meanwhile, the Supreme Court blocked Biden’s plan to forgive $430B in student loan debt. A for effort.

    Is the price of education even worth it?

    According to the College Board,

    “In 2021, full-time workers 25 and older with a bachelor’s degree out-earned those with a high school diploma and no degree by about $27,000/yr.

    But over half of grads from public four-year institutions left with federal debt averaging over $21,000.”

    When many students’ main motivation for attending college is to increase their earning potential, learning that they may just break even changes things.

    ROI (return on investment) also gets complicated when you take a deeper look into cost vs. earning potential...

    Per The College Payoff report from Georgetown University’s Center on Education and the Workforce:

    “31% of workers with no more than a high school diploma earn more than half of workers with an associate’s degree.

    Likewise, 28% of workers with an associate’s degree earn more than half of workers with a bachelor’s degree, and 36% of workers with a bachelor’s degree earn more than half of workers with a master’s degree.”

Simply put, even ignoring debt, higher education does not guarantee higher earnings. It helps. But broad-stroke averages of degree-holders’ earnings obfuscate the reality:

    Millions of people with lower levels of education are making more than those with higher levels of education.

    Perhaps for this reason, 56% of respondents to a recent survey said a four-year degree was a “bad bet.”

    For some real gut-wrenching numbers...

    Melissa Korn and Andrea Fuller write in their Wall Street Journal piece Financially Hobbled for Life: The Elite Master’s Degrees That Don’t Pay Off,

    “Recent film program graduates of Columbia University who took out federal student loans had a median debt of $181,000. Yet two years after earning their master’s degrees, half of the borrowers were making less than $30,000 a year.

    At New York University, graduates with a master’s degree in publishing borrowed a median $116,000 and had an annual median income of $42,000 two years after the program.

    At Northwestern University, half of those who earned degrees in speech-language pathology borrowed $148,000 or more, and the graduates had a median income of $60,000 two years later.

    Graduates of the University of Southern California’s marriage and family counseling program borrowed a median $124,000 and half earned $50,000 or less over the same period.”

    They conclude,

    “Highly selective universities have benefited from free-flowing federal loan money, and with demand for spots far exceeding supply, the schools have been able to raise tuition largely unchecked.

    The power of legacy branding lets prestigious universities say, in effect, that their degrees are worth whatever they charge.”

    When the government guarantees loans, institutions can make the price whatever they want. As lawyer and writer Paul Skallas puts it,

    “It’s a subsidy without any type of price regulation.”

    Buyer beware.

    In a recent report from the GAO (U.S. Government Accountability Office),

“91% of schools don’t properly list their net price, or the amount a student is expected to pay for tuition, fees, room, board and other expenses after taking into account scholarships and grants.”

    The fine print is hidden. And to be clear, this isn’t just a U.S. problem.

    It’s perhaps why 41% of U.K. students have considered leaving their courses due to money issues, and 68% are unable to afford course materials.

For decades, tens of millions of young adults have mindlessly enrolled in institutions under the presumption that they must in order to (1) become employed and (2) earn more.

    Only now, the financial repercussions and ROI cannot be ignored.

    How’d this get so out of hand?

    A Reversal of Figure & Ground

    Oh, rankings.

The college and university leaderboards are the culprits behind the current race to the top.

    In business, a “race to the bottom” is a strategy where product price or service quality is lowered (i.e. sacrificed) in order to obtain more customers, market share or otherwise gain a competitive edge. Think: Amazon selling cheaper and cheaper products (even at a significant loss) in order to get more shoppers to habitually use its platform.

In education’s case, institutions have entered a “race to the top,” a strategy of ranking higher and higher in order to gain a competitive edge. But here, education is sacrificed while ranking requirements, and therefore position, become the priority.

    Ranks have become a heuristic shortcut to determine the quality of education and therefore job preparedness... which then means organizational value, and therefore earning potential.

    It’s flawed.

    Yet for both institutions and students it’s hard to think otherwise. We’re missing the big picture.

Rooted in Gestalt psychology, this larger phenomenon is called “a reversal of the figure and ground” — whereby the original purpose of something (e.g., education quality) becomes eclipsed by a once-secondary element (e.g., alumni giving). Today, the latter has become more important than the former.

    Media theorist Douglas Rushkoff explains the original reversal of the figure and ground in education in his book Team Human,

    “Public schools were originally conceived as a way of improving the quality of life for workers.

    Teaching people to read and write had nothing to do with making them better coal miners or farmers; the goal was to expose the less privileged classes to the great works of art, literature, and religion.

    A good education was also a requirement for a functioning democracy. If the people don’t have the capacity to make informed choices, then democracy might easily descend into tyranny.

    Over time, as tax dollars grew scarce and competition between nations fierce, schools became obliged to prove their value more concretely. For the poor in particular, school became the ticket to class mobility. Completing a high school or college education opens employment opportunities that would be closed otherwise.

    But once we see competitive advantage and employment opportunity as the primary purposes of education rather than its ancillary benefits, something strange begins to happen.

    Entire curriculums are rewritten to teach the skills that students will need in the workplace. Schools consult corporations to find out what will make students more valuable to them. For their part, the corporations get to externalize the costs of employee training to the public school system, while the schools, in turn, surrender their mission of expanding the horizons of the working class to the more immediate purpose of job readiness.

    Instead of compensating for the utilitarian quality of workers’ lives, education becomes an extension of it. Where learning was the purpose — the figure — in the original model of public education, now it is the ground, or merely the means through which workers are prepared for their jobs.”

The figure and the ground have reversed around education and brand.

    Ranking, or brand name, is now in the driver’s seat, optimistically determining employment and earnings. Life preparation and measurable ROI are now moot, left in the dust.

    For those who decided (or still decide) to enroll, really, it’s the paper degree, network and ranked logo which matter most.

    As NYU professor and marketer Scott Galloway puts it,

    “The ultimate luxury brands in the world are now American universities.”

    Higher ed’s sold product is now just the brand itself.

    It’s gone meta.

    We’ve lost the plot.

    Colin Diver, a former college president, university dean and author of Breaking Ranks: How the Rankings Industry Rules Higher Education and What to Do about It, reminds us,

    “Every step in the [ranking] process — from the selection of variables, the weights assigned to them and the methods for measuring them — is based on essentially arbitrary judgments.

    U.S. News, for example, selects 17 metrics for its formula from among hundreds of available choices.

    Why does it use, say, students’ SAT scores, but not their high school GPAs? Faculty salaries but not faculty teaching quality? Alumni giving, but not alumni earnings? Why does it not include items such as a school’s spending on financial aid or its racial and ethnic diversity?

    Lurking behind data manipulation lies the even larger problem of schools altering their academic practices in a desperate attempt to gain ranking points.”

    As the consultancy McKinsey rightly points out, institutions focusing on the same criteria lead to greater homogenization, while we should instead be focusing on more distinctiveness.

    All institutions attempt to emulate the elite.

Today, institutional spending on student services (food, gyms, events, stadiums, etc.) is growing four times as fast as spending on instruction.

    But for institutions, deciding to roll back these amenities doesn’t mean others will too... they’ll now just be the weaker choice.

With all due respect: telling young adults with not-yet-fully-developed brains that these luxuries are somehow related to their education, and encouraging them to see college ranking as synonymous with their personal worth, is unnecessarily anxiety-provoking, maliciously deceptive and borderline fraudulent.

    So as it all falls apart, what can we do?

There are two ways to approach our moment:

First, we’re likely too far gone to revert to our original figure and ground. If higher ed is now the means to secure employment and social capital, institutions must do a better job of adapting to this reality.

But second, this is not to say the ethos of exploration and personal development cannot remain embedded within institutional programs, or that these two approaches cannot co-exist.

    They can.

    The future of education requires a “yes and” mentality: theory and practice.

Changes & Solution Spaces

Institutional higher education is in free fall; its future is unwritten. As tuition rises, enrollment drops and confidence in institutions continues to decline, they simultaneously face new conditions: namely, the growing demands of employers, the new expectations of learners, and the forceful crosswinds of technology and inequality.

    Prioritizing the student, we see five unique spaces where unprecedented change will manifest, and that’s where our attention is required.

    01. Re-evaluate Employment Qualifications

    With all that’s been discussed, there’s rightfully been a loosening of degree requirements, permitting more diverse and equally qualified — and sometimes more qualified — talent to enter the workforce.

    Between 2017 and 2019, employers loosened degree requirements for 46% of middle-skill occupations and 31% of high-skill ones. We should expect these numbers to rise. Good.

While Maryland, Pennsylvania, Utah, and Alaska will no longer require four-year degrees for thousands of state jobs, Obama and Biden continue to encourage more companies to strip their degree requirements in what’s being dubbed the “Paper Ceiling” and “Diploma Divide.”

    College shouldn’t be the only pathway for prosperity. Full stop.

    This top-down normalization will be incredibly effective at changing the culture, lessening the pressure of taking on debt and encouraging the pursuit of education outside the institution.

    According to Kate Naranjo, an Economic Mobility Policy Advocate and Non-Profit Strategist,

    “If we want to rebuild economic mobility, we must see both college and alternative routes as viable ways to build a thriving labor force and a path to the middle class. [...] We already know college alone cannot fix declining economic mobility because it does not pay off equally for all workers.”

    To redefine how we label someone as “qualified” — beyond merely eliminating degree requirements — we must also accelerate skills-based hiring, allowing the over 70 million workers over the age of 25 who are “skilled through alternative routes” (i.e., STARs) a chance to compete for jobs for which they are trained. These skills-based trainings are a sound investment and an opportunity for partnership and experimentation.

    As interest in degrees slips, it only makes sense to deprioritize credentialism and loosen the chokehold that elite institutions and college rankings have on society’s understanding of merit.

    But for as long as we still see college tethered to employment, we have to address institutions’ (evolving) role regarding qualifications — their online certification programs and classes are just a start.

    Part of this onus explicitly falls on career services. Less emphasis should be placed on basic job information and more on equipping students with robust professional networks and forming partnerships with employers that lead to good-paying jobs in graduates’ fields of study.

    For example, the University of Tennessee announced the Tombras School of Advertising and Public Relations in partnership with Tombras, a Knoxville-based ad agency. According to the vision, students will be taught in classrooms which resemble spaces in the industry, and curricula will be continuously enriched by insights from Tombras. This way, coursework will stay relevant and useful as a result of the direct feedback and communication between college and employer – a dynamic which is scarce. This will also effectively increase the advertising industry’s diverse representation, confronting the “talent shortage” often expressed by employers.

    Inarguably, courses taught with Tombras will develop better-prepared employees.

    But why enroll in the University of Tennessee to receive an education from Tombras, if one could just receive a Tombras education (and then a Tombras job) without the University of Tennessee as a middleman and its ludicrous tuition? How can this happen?

    While evolving requirements and partnerships to address employment are needed, they don’t wholly rectify the problem of access and quality of the education itself.

    02. New Environments & Teachers

    “Future 2043,” a report by innovation consultancy SpringWise, forecasts:

    “School will likely be a concept rather than a place where our children go to learn.”

    When “education” is reduced to information — we recognize how absurd the price tag of institutions has become. Remember: the university model was concocted before the internet. Never before in history have students had this much access to the world’s best material (books, lectures, videos, experts, interviews, galleries, etc.) — much of which is currently free.

    It’s like continuing to ask people to pay for individual songs via iTunes when the norm is free, ad-supported music or unlimited streaming for a small monthly fee. But instead of $9.99/mo., it’s $2,132/hr.

    The value of education is not just the information, but the environment in which it’s learned — more precisely, the physical or virtual spaces, and those who inhabit them: teachers and students.

    MOOCs (massive open online courses) have had some traction — Coursera, an online education platform, alone has reported over 100M users and 7,000 courses, and found that 87% of students reported career benefits like a new job or promotion.

    But what you gain in scale you lose in that valuable component: environment.

    As a result, one study found that only 3% of MOOC participants finish a course.

    Environment is absolutely critical when education becomes an online video.

    There must be something in between a free MOOC and an exclusive, pricey university campus.

    What does this look like? That’s for us to decide.

    Whatever it is, though, it must allow us to address various learning preferences, become more flexible in every respect, and integrate countless forms of diversity – particularly age, if we want a future of lifelong learners.

    It’s an immeasurable opportunity... and brands are perfectly poised to substitute-teach. This is a real opportunity for brands that fight every day to position themselves as trustworthy experts and that have existing cultural and financial capital.

    What is the professorial role of a brand in a future where there are no walls to a classroom?

    Imagine: Agriculture & Logistics brought to you by Sweetgreen, Physiology by the NBA x Alo, Network Dynamics by Verizon, or Intro to Film & Animation by Disney. It’s not that ludicrous. Toyota already has a technical training program in partnership with vocational schools, McDonald’s offers a leadership and management degree at Hamburger University, and Grow with Google now teaches countless courses, including cybersecurity.

    But again, the environment is key. We can’t forget that. A video and chat room won’t cut it here. But you don’t need a campus or football stadium either.

    Wherever or whoever is teaching, intimacy, accountability and community are what must be woven in. The latter, community, is perhaps the most important. After all, “network” remains one of the most attractive selling points of institutions today. When envisioning our alternatives, who is within these environments, and what are the opportunities for them to connect, collaborate and benefit from one another?

    Meanwhile, there’s also the bottom-up approach to these new environments and teachers — DAOs (Decentralized Autonomous Organizations) included.

    There are now countless platforms allowing anyone to teach — for better or worse. There’s Podia, Teachable, Kajabi, Thinkific, Memberful, Loom, Circle, Studio, Course Studio and Maven, for starters.

    Imagine an immersive Masterclass for the Long Tail.

    A couple of years ago, Emily Gillcrist founded Vital Thought, a human-centered education and consulting platform which allows PhD experts to teach intimate cohorts, monetizing their expertise, and allows the general public to receive invaluable insight.

    Courses range from “Games and Gods” and “Earth Systems Theory,” to “Ethics of the Dead: Remains in Museums” and “Science & The Occult.”

    Gillcrist shares,

    “In order to really democratize education we need to invest not only in the distribution and access to information and content, but we need to support teaching and environments where learning can be cultivated over significant periods of time.

    Education is not simply a download.

    I believe that living human beings are the arbiters of knowledge and value, they are the ones who can develop the expertise to interpret the meaning and value of the hordes of complex and often contradictory information being relentlessly collected in every corner of society.

    That ability to interpret is indispensable right now.

    I am aiming to cultivate spaces that function as contexts of meaning, so experts can share their insight with people who want to listen and learn while sharing and enriching their own views and those of everyone in their cohort.”

    In 2006, the movie Accepted premiered, a comedy about misfit high school graduates who build a fictitious college to get their parents off their backs. They call it the South Harmon Institute of Technology, or S.H.I.T. But in the end, they learn to co-teach one another in exotic yet exciting and practical courses. This cornucopia of passionate learning pods bucks the status quo, rejects conformity and kindles its own special environment. At the film’s climax, Justin Long pleads with the established university that’s trying to shut down S.H.I.T.,

    “Why can’t we both exist?”

    In Accepted, the ultimate draw for students was the ability to co-author their curriculum. They studied what they wanted.

    Two in five American college graduates regret their major. For those with humanities and arts degrees, that number is nearly half.

    But there’s a reason these students select these subjects in the first place: curiosity.

    Our solution isn’t to sway students into fields where they feel their degree is ultimately worthwhile.

    Perhaps it’s finding the practicality of their interests, explicitly connecting them to employment preparedness, and also offering opportunities to explore interests for exploration’s sake.

    We can decouple employment preparedness and intellectual curiosity, offering solutions for both simultaneously. These offers don’t even have to come from the same education provider.

    Call it: Holistic Education.

    03. HEAL ≥ STEM

    ...But for those who see higher ed’s sole purpose as the gateway to a related career, we need preparedness for jobs not only in STEM but also in what Senior Fellow at the Brookings Institution Richard Reeves calls “HEAL” jobs (Health, Education, Administration, and Literacy). This is particularly critical for men.

    In Reeves’ book, Of Boys and Men, he shares,

    “In 1980, women accounted for just 13% of jobs in the STEM field (Science, Technology, Engineering and Math); the share has now more than doubled, to 27%. But the same is not true in the other direction. Traditionally female occupations, especially in what I call the HEAL fields have, if anything, become even more ‘pink collar.’

    Just 26% of HEAL jobs are held by men, down from 35% in 1980. The gender desegregation of the labor market has so far been almost entirely one-way.

    While women are doing ‘men’s jobs,’ men are not doing ‘women’s jobs.’”

    It is imperative not to prioritize career fields within STEM over careers in HEAL.

    Each field serves a unique purpose within society and must be presented objectively as an option for consideration, despite any stereotypes associated with each.

    According to Reeves, the main reasons we need more men in HEAL are to 1. address the decline in traditionally male occupations, 2. provide a better service to boys and men, and 3. help meet labor shortages in critical occupations like nursing and teaching.

    If women belong in STEM, then men belong in HEAL.

    This is certainly true from a moral standpoint. But it is also pragmatic: it offers one way to address the staggering humanities crisis within universities.

    Take psychology, for example:

    “The proportion of men in psychology has dropped from 39% to 29% in the last decade. Among psychologists aged 30 or less, the male share is just 5%.”

    This gender imbalance creates a false perception that only women are suited to be psychologists when, in fact, the occupation demands perspectives and representation from all genders.

    And this doesn’t just apply to psychology. The share of male teachers is incredibly low: men account for just 24% of K-12 teachers, down from 33% in the early 1980s.

    “Only one in ten elementary school teachers are male. In early education, men are virtually invisible. It ought to be a source of national shame that only 3% of pre-K and kindergarten teachers are men.”

    Humanities enrollment has plummeted in recent decades due to women flocking to STEM and men avoiding HEAL — often guided by the notion that a STEM degree enhances their job prospects and income potential, which is understandable (but not an absolute truth).

    With this, we need to change the narrative surrounding HEAL, as we’ve managed to do with STEM, so that every student feels encouraged, not forced, to pursue an occupation in these fields, so long as it’s actually an interest.

    04. With, Not Against AI

    While we work on relaxing demands around degrees, and envision alternatives, we must also address (and call on) the elephant in the room — AI.

    Artificially intelligent text generators (AITGs) like ChatGPT are already disrupting classrooms.

    Students are using these tools to complete assignments, and immediately, educators are voicing concerns around students’ over-reliance on them. (Not to mention the debates around whether the college essay is dead or alive, further questioning qualifications.)

    In fact, the first year of college with the introduction of AI ended in “catastrophe,” according to Ian Bogost, a professor and writer with The Atlantic.

    He forecasts how this tech will continue challenging the traditional dynamics of higher ed institutions:

    “One way or another, the arms race will continue. Students will be tempted to use AI too much, and universities will try to stop them. Professors can choose to accept some forms of AI-enabled work and outlaw others, but their choices will be shaped by the software that they’re given. Technology itself will be more powerful than official policy or deep reflection. Universities, too, will struggle to adapt. Most theories of academic integrity rely on crediting people for their work, not machines. That means old-fashioned honor codes will receive some modest updates, and the panels that investigate suspected cheaters will have to reckon with the mysteries of novel AI-detection ‘evidence.’ And then everything will change again. By the time each new system has been put in place, both technology and the customs for its use could well have shifted.”

    If college campuses plan to persist into the future, and if we are to develop higher ed alternatives, all must accept the responsibility of reevaluating how they perceive, address, and work with technology.

    Academia should view technology neither as a threat to learning nor as a tool for cheating, but rather as an opportunity to teach students how to be honest and adaptable in an evolving society, and further as a scalable, intellectual sparring partner.

    Rather than wait out or avoid the inevitable, IE University in Madrid, Spain has already dived in, actively reinventing higher education by welcoming AI into the classroom as an essential part of its future and a valuable addition to the pursuit of scholarship.

    But with all of these possibilities come two stark reminders:

    First, ask ChatGPT to answer, “What weighs more? Five pounds of feathers or one pound of bricks?” And it will answer, “The bricks.” That’s because the AI is not actually intelligent so much as it is exceptional at guessing which word should come next in its answer. There is no real logic here. So if we’re to throw these tools into the classroom, and increasingly rely upon them in “the future of work,” it suits us to strengthen our own logic, reasoning and critical thinking.

    And second, the desire to engage with these technologies may reveal a deeper-rooted issue.

    Instead of banning students from using these tools, we should question why they’re so attracted to them in the first place.

    How do we engender environments where students feel empowered to choose topics they are genuinely interested in, rather than mandate curricula and assignments that lead them to cut corners, learning only how to use ChatGPT to accomplish tasks?

    And lastly, in regard to employment: with headlines projecting the dire displacement of humans by AI in the workforce, we must remember our own hand in this. AI does not displace humans; humans displace themselves with AI. We get to, and should, mindfully orchestrate a collaboration that benefits us, not the machine.

    05. Growth & Identity

    As we interrogate and reinvent education itself, we’d be mistaken not to address another value prop of the institution. When the traditional college experience is no more, how do we kindle alternatives for social experiences and rites of passage?

    We need to find new ways for people to grow emotionally and find worth beyond school rankings.

    Ian Bogost puts it bluntly,

    “Without the college experience, a college education alone seems insufficient. Quietly, higher education was always an excuse to justify the college lifestyle. But the pandemic has revealed that university life is far more embedded in the American idea than anyone thought. [America is] far less interested in the education for which students supposedly attend. In the United States, higher education offers a fantasy for how kids should grow up: by competing for admission to a rarefied place, which erects a safe cocoon that facilitates debauchery and self-discovery, out of which an adult emerges. The process — not just the result, a degree — offers access to opportunity, camaraderie, and even matrimony.”

    But while colleges like to promote themselves as the destination where students go to revel in their newfound independence as young adults, what consistently takes place on university campuses, in many ways, does not resemble the nuanced realities of adulthood — nor does the four-year program fully educate graduates on how to engage with the real world or participate in its workforce.

    Who says the campus was the best method for growth?

    As we innovate across the board, we must examine emotional growth as much as we are concerned with intellectual growth.

    The School of Life is a shining example of an organization outside the institution engendering emotional growth through books, articles, games, events, videos, and counseling services. Ranging from self-knowledge, relationships, and sociability to work, calmness and leisure, The School of Life media platform teaches the invaluable skills that must be part of our education reform conversations.

    But we have to address more than just growth. Colleges (and their rankings) provide a profound sense of identity, pride and self-worth — though a suspect one.

    Author William Deresiewicz believes,

    “[A]n elite education inculcates a false sense of self-worth. Getting to an elite college, being at an elite college, and going on from an elite college — all involve numerical rankings: SAT, GPA, GRE. You learn to think of yourself in terms of those numbers. They come to signify not only your fate, but your identity; not only your identity, but your value. It’s been said that what those tests really measure is your ability to take tests, but even if they measure something real, it is only a small slice of the real.”

    Where else may this self-worth come from? And further, can we craft environments for experimentation and failure, rather than performance and perfection?

    He concludes,

    “If you’re afraid to fail, you’re afraid to take risks, the most damning disadvantage of an elite education. It is profoundly anti-intellectual. [...] The system forgot to teach [students], along the way to the prestige admissions and the lucrative jobs, that the most important achievements can’t be measured by a letter or a number or a name.

    It forgot that the true purpose of education is to make minds, not careers.”

    At this radical moment of change, we have the opportunity to reshape what we want from higher education... and we all have a role — makers, marketers, investors and thinkers alike. We get to author what comes to replace institutions as we know them.

    And while doing so, we should remember the moment before the figure and ground reversed: when education wasn’t for logos on LinkedIn, but for whatever made us more fulfilled and engaged citizens.

    There’s a blank page in front of us.

    Time to start drafting.

    A deep, heartfelt thanks to Robert Cain for the invaluable research, drafting and edit support in this piece.



    This is a public episode. If you’d like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe

  • The following is a summary of my 2023 SXSW Talk: Movements > Trends. Here’s Part II.

    Firstly — there is no such thing as “online culture” vs. “culture.” That’s the digital dualism fallacy kicking in.

    It’s just one and the same.

    But for the sake of common understanding — “online culture” in this instance is the fast-culture, memeified online discourse with which organizations are too often obsessed.

    It’s a shift that occurred ~15 years ago.

    2007 was a monumental year for marketing.

    Facebook introduced Pages.

    Brands suddenly looked exactly like our friends.

    They weren’t.

    But nonetheless, brands saw the opportunity. And it was a glimmering one.

    “What do we have to do or say to feel like a friend?”

    Ever since the ’00s, brands have been seeking out material and excuses to join in online discourse across social — the perceived “hotbed” of culture.

    “If we win these discussions, we win culture... and then sales.”

    It’s uncertain whether this notion has ever been measured or supported, but it was — and often remains — the collective hypothesis.

    Regardless, “trending” headlines and the meme of the moment became the focus for “friendly relatability.” Attempts to resonate and cut through, optimizing for attention, have resulted in an obsession: scan, track, measure, understand and activate upon whatever’s “trending.”

    Brands say bae, express nihilism — are they depressed? — and are now seemingly... horny?

    Hashtags, challenges, and aesthetics have replaced the original intention of a “trend”: a meaningful social shift in human behavior.

    We’ve come to conflate “trending” with “trends.”

    In the process of chasing cool, most discussed “trends” are really just frivolous entertainment.

    We’ve lost the plot.

    Meanwhile, two other macro factors have helped further reverse the figure and ground.

    In a moment of chronic uncertainty, trends have become our “answers” — comforting explanations of what comes next.

    And simultaneously while culture also feels stagnant, trends have become our “progress” — comforting change.

    As a result, the number of published trend reports has roughly tripled since 2016.

    Trends are trending.

    And the trending is seen as trends.

    It’s a mess.

    Yet in primary research, when asking 1,500 people globally if they’ve heard of ten “trends” — from Cottagecore and Barbiecore to Indie Sleaze and Permacrisis — 43% hadn’t heard of a single one.

    Utter “vibe shift” to the general public, and they’ll think you’re speaking a foreign language.

    ...Because you are.

    And meanwhile, for the 57% of people who have heard of one of the most discussed “trends,” less than half of those people have actually participated in any capacity.

    The vast majority of people have not heard of what cultural thinkers and strategists obsess over, and the general public isn’t doing anything with it.

    “Trends” as we currently know them are really only for ourselves. That’s fine... but only so long as we recognize they’re untethered from the real needs and desires of real people.

    These are empty vessels for us to fill with whatever explanations we wish. They are our Rorschach tests. Cottagecore is whatever we want it to be... because it doesn’t actually exist.

    If our foundational task is to understand people, we’re way off the mark.

    For this reason, we need to break up with trends as we currently know them. It’s a toxic relationship.

    The critical caveat here is that understanding culture remains a priority; the nuance is mistaking “trending” for substantial ideas worthy of strategy and investment. We must continue to study these signals, but with a dose of skepticism and a healthy distance.

    If anything, they’re signals in themselves, not substantial shifts.

    Cottagecore as a viral, idyllic aspirational aesthetic is one thing. A sensibility. But we have to hold that in conjunction with the reality that this “trend” only applies to a fraction of a fraction of people... with minimal behavior being nudged.

    More precisely, there are three reasons why our current approach to understanding “online culture” requires a gut-check:

    Firstly, it’s exhausting.

    Sixty-four percent of people feel culture is accelerating. And that’s according to the consumers. How about the strategists tasked with keeping the pulse, analyzing and activating?

    We now have anti-trend trends, and currently #corecore — the trends have gone meta: online commentary about the absurdity of living online.

    We’re chasing “trends” which are inherently fleeting, and ephemerality has a notoriously low ROI.

    At SXSW this year, a leading social platform argued for the importance of “ephemeral trends” — we know what that means, right? When has investing in “temporary” ever been a sound business decision?

    This is simply an unsustainable and unwise practice.

    The second reason we need to interrogate our approach is because our current process is futile.

    Two-thirds of people believe brands are trying way too hard today. Even if a brand were to successfully chase down and capture the fleeting and act upon it, its mere presence undermines the outcome.

    As a brand, wrap your arms around something and you (often) kill it. That’s just how it works.

    But many still don’t want to accept this law.

    Take a look at r/FellowKids — the unfortunately still-growing graveyard of cringe.

    Brand participation begets erosion.

    And for the brand who doesn’t mind the cringe and leans in regardless for engagement’s sake, psst... it’s still cringe.

    And the third reason we need to break up with “internet trends” as we know them is that these concepts are often inherently empty — devoid of meaning.

    Let’s go back to physics.

    Sorry.

    The equation for force is mass times acceleration.

    (F)orce = (m)ass • (a)cceleration

    Or more simply, force is calculated by the “weight” of something times its “speed.”

    Why do we care about force? Because culture is made up of forces: the crosswinds, efforts and influences of ideas and behaviors.

    For us to understand what to pay attention to, we need to be calculating “force.”

    But the problem is, we’re using the wrong variables.

    We’re failing physics.

    Of course we are.

    For the variable of speed, we have to recognize, today, everything is fast. Everything.

    Seventy-four percent of people believe algorithms can make anything go viral.

    In this context, speed is table-stakes. Anything new just moves fast. Fast is the norm. And as a result, we’re confusing speed with newness.

    Ironically, it’s perhaps the slower moving or sustained shifts that are more valuable to us today — the ones with prolonged energy.

    And for weight, today, everything is big. Everything.

    Again thanks to algorithms, everything has a trillion views. Fame is democratized and each piece of content can reach more people than the average blockbuster. Size is what’s distracting us.

    But size isn’t the metric we need to be paying attention to. Consider a balloon and a bowling ball. Both can be the same size, but they have very different weights. Remember, it’s weight that we’re after.

    We’re too often confusing what’s trending for a machine with the real desires of humans.

    So our current working formula is:

    Force = Size • Newness

    We’re way off and exhausted.

    We need to go back to the original formula.

    Force = Weight (or the meaningfulness) • Speed (or the momentum)

    Or more simply, we need to focus on bowling balls over balloons.

    Balloons are cheap, pop or fly away.

    Why would that ever be a winning strategy?
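    The weight-times-speed formula above can be sketched as a toy scoring function. This is purely illustrative: the function name and the numbers assigned to the “balloon” and “bowling ball” are hypothetical, not measurements from the talk or survey.

    ```python
    # Toy sketch of the essay's formula: Force = Weight (meaningfulness) x Speed (momentum).
    # All names and numbers below are hypothetical, for illustration only.

    def cultural_force(weight: float, speed: float) -> float:
        """Score a cultural signal by its meaningfulness (weight) times its momentum (speed)."""
        return weight * speed

    # A "balloon": huge and fast-moving, but nearly weightless in meaning.
    balloon = cultural_force(weight=0.5, speed=9.0)

    # A "bowling ball": slower-moving, but heavy with sustained meaning.
    bowling_ball = cultural_force(weight=8.0, speed=2.0)

    assert bowling_ball > balloon  # the heavier, slower signal carries more force
    ```

    The point of the sketch: under this framing, a modest speed multiplied by real weight outscores raw virality multiplied by near-zero meaning.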

    When the vast majority of people would prefer brands to “serve my needs by understanding what I care about” (70%) over “appears relevant by leaning into the latest trends” (30%), a new strategy is required.

    It’s about going back to basics.

    Physics 101: remembering the true definition of force — does this actually have weight and sustained energy?

    Psychology 101: remembering the human — does this actually mean something to a real person, not an algorithm?

    Business 101: remembering ROI — does this actually move a needle, and is it a sound investment of time, energy and resources?

    And if there’s no resounding “yes” to the above, let’s put it aside for the moment and just keep tabs on it.

    Not doing so is a disservice to our clients, ourselves and our industry.

    Survey Sourcing: “Modern Movements > Trends” Reddit & Attest, Jan 2023, n=1,500 (US, UK, FR & AUS)



  • Meet Liver King.

    He’s a media-personality caricature repping the “all-meat diet.” He chomps animal brains to win big in the attention economy as much as he fights for a reassessment of what a more nutritious diet may entail.

    His success primarily lies in the former: attention.

    Many question his honesty. There are countless videos “exposing” his regimen and potential steroid use. But it’s moot. Controversy only adds to his hyper-masculine mythology.

    The Carnivore Diet has been around for as long as the internet has. The pitch ranges from weight loss and increased energy to higher testosterone and mental clarity.

    But several more drivers are now giving this “lifestyle” newfound energy.

    Firstly, it’s never been easier to get in touch with a tribe of like-minded thinkers. Often exposed via algorithmic means, an odd practice effortlessly reaches millions today. A video — or the mere thumbnail of one — is an invite for new, potential inductees. With this, we can now choose our own adventure of truth and determine what’s healthiest for us.

    Secondly, attention around the all-meat diet has risen with the larger adoption of veganism — also, coincidentally, driven by health benefits. The blossoming of plant-based diets has allowed a counter-trend to enter and thrive. It’s no surprise that we see the Carnivore Diet rage in a moment when meat alternatives are increasingly finding their way onto menus.

    After all, many cultural trends are just tensions. Equal and opposite reactions. Trend. Counter-trend. Cause. Effect.

    Further, meat consumption also symbolizes status and mastery over one’s domain — one that is currently aflame and that we’re hastily losing. Promoting one’s machismo dominance is also quite timely as we simultaneously evolve beyond a gender binary.

    Again: Trend. Counter-trend.

    Back to Liver King... A six pack, grizzly beard and bloody goat intestines appear to run counter to animal rights, environmental decline and gender fluidity.

    And here lies the ultimate overarching pitch and the final driver of this all-meat diet: identity, and the community which comes along with it.

    You don’t even have to consume the raw liver.

    You just have to consume the content.

    The all-meat diet is a starter pack of values.

    Worship him or ridicule him — either gives you the opportunity to express your beliefs, find a vocal role in this world, and come closer to those who feel the same about animals, the environment or gender.

    Modern Religions

    In Tara Isabella Burton’s book, Strange Rites: New Religions for a Godless World, she reminds us that religion is more than places of worship or mere deities.

    Religion can be anything that provides us meaning, purpose, ritual and community.

    An all-meat diet is a religion.

    And Liver King is our high priest.

    Burton reports:

    “Back in 2007, 15% of Americans called themselves religiously unaffiliated, meaning that they didn’t consider themselves to be members of any traditional organized religion.

    By 2012, that number had risen to 20%, and to 30% when it came to adults under thirty. Now, those numbers are higher.

    About a quarter of American adults say they have no religion. And when you look at young millennials — those born after 1990 — those numbers reach almost 40%.”

    But while younger generations claim to be “less religious,” that’s not to say they aren’t rabidly seeking spirituality, answers or belonging.

    Definitions and modern examples of religion just haven’t caught up to the surveys.

    Outside of entertainment fandom, and more glaring today: politics and social justice have become our loudest religious replacements.

    Helen Lewis, staff writer at The Atlantic puts it,

    “Many common social-justice phrases have echoes of a catechism: announcing your pronouns or performing a land acknowledgment shows allegiance to a common belief, reassuring a group that everyone present shares the same values.

    But treating politics like a religion also makes it more emotionally volatile, more tribal (because differences of opinion become matters of good and evil) and more prone to outbreaks of moralizing and piety.”

    Burton points out:

    “A full 72% of the Nones [those who are religion-less] say they believe in, if not the God of the Bible, at least something.”

    Today, righteousness is up for creative interpretation and gospel is co-written in the comments.

    Dogecoin Dogma

    Meme stonks and crypto provide hundreds of thousands of people with moral meaning (giving power to the people), devout purpose (going to the moon or taking down “The Man”), steady ritual (buying the dip or “gm”), and passionate community (servers to subreddits). There’s a prophet, Satoshi, and a sacred text, The White Paper. The very first block is even called The Genesis Block.

    This is all a profoundly deep, shared belief in something. A contagious energy. A shared spirit. There are morals and morale here — crypto is seen as a path to salvation, “the answer to all of humanity’s problems.”

    Bloomberg’s Lorcan Roche Kelly calls bitcoin:

    “The first true religion of the 21st century.”

    Karl Marx claimed that “Religion is the opium of the people,” but instead, modern religions are really the amphetamines of the people.

    Praying with Potter

    Harry Potter is perhaps the most established modern religion we’ve got.

    With a moral compass from shared sacred scripture, Potter has been offering a profound sense of belonging to the Wizarding World for a quarter of a century now. Potterheads take pilgrimages to Hogsmeade™ village at Universal Studios Orlando and congregate around their own interpretations of the new testament: fan fiction. The Hogwarts house system even provides specific denominations for deeper affiliation.

    Endangering the ecosystem to pay their respects, fans have been recently urged to stop leaving socks at the fictional grave of Dobby at Freshwater West Beach in Wales.

    And since the very beginning, traditional religious groups have either attacked or compared the magic of the series to their own beliefs. Religious disaffiliation now also occurs when members reject the base actions of the faith’s own leaders.

    Bitcoin and Gryffindor are symbols of modern religions if we’ve ever seen them.

    In Sync

    For younger generations raised on remix culture, we see the stitching together of behaviors and content as new religions. And these religions also stitch us together.

    As Burton writes,

    “In his 1911 book The Elementary Forms of Religious Life, Durkheim argues that religion is basically the glue that keeps a society together: a set of rituals and beliefs that people affirm in order to strengthen their identity as a group. Religion is a ‘unified system of beliefs and practices which unite in one single moral community called a Church all those who adhere to them.’

    This church, furthermore, is sustained not through a top-down hierarchy, or through some invisible spirit, but rather through the collective energy of its adherents, a process he calls ‘collective effervescence,’ a shared intoxication participants experience when they join together in a symbolically significant, socially cohesive action.”

    From diets like all meat, to the absence of food like OMAD (one meal a day), to the targets of hurled “cult” slurs like Goop or CrossFit — the gospel of wellness grants opportunities for shared values, goals and rituals. These socially cohesive practices are Durkheim’s “collective effervescence.”

    And this religious collectiveness is a solve for Cultural Synchrony — cohesion and concurrence during a moment of social polarities and algorithmic segmentation.

    Modern religions sync us.

    Worshiping Workism

    Our “Great Resignation,” Anti-Work and Overemployed movements also check the boxes of modern religions.

    For the last two decades, as traditional religion declined and capitalism thrived, work stepped in as a seamless substitute. Blackberries and boardrooms as altars, we prayed for promotions. We went as far as replacing “career” with calling and passion. WeWork’s entire rise (and fall) can be traced back to Neumann’s religious aspirations.

    And with that, another component of religion is the leader. As Joe Rogan ironically points out,

    “There’s some weird thing about human beings where they gravitate towards a big leader [...] There’s almost like a cheat code.”

    From Musk and Trump, to Billy McFarland, Anna Delvey, Elizabeth Holmes, and Sam Bankman-Fried, the line between a charismatic leader and a cult of personality is razor thin. The exploitation of scam culture within the context of our yearning for modern religion is worthy of our mindfulness.

    Gary Vee is our “youth pastor of capitalism.”

    But only recently — with a pandemic, unemployment, and widespread WFH holding a mirror to this greedy, corporate faith, catalyzing mass reflection — have many reconsidered this theology. In 2020, when governments legally withheld purpose from the masses, the vibe shift was underway.

    Recognizing it as a truly endless spiritual pursuit, millions more are now stepping off the treadmill toward dream job nirvana. Did it ever really exist, though?

    Arguably most influential of all, when the church is physically closed and our religious practice is reduced to a Zoom screen in an empty apartment without real socialization, we lose our religion.

    Loneliness

    While other religious stand-ins like QAnon, cosplay, K-pop, stans, Wicca, astrology, anti-vax, Disney Adults, online sleuthing (think: Couch Guy detectives), or young men devoted to their Bored Apes or DAOs’ governance all check the boxes of meaning, purpose, ritual and community — the most influential driver of our newfound spirituality is our loneliness.

    Above all, today’s modern religions provide community.

    According to multiple studies, 56% of Gen Z report “growing up lonely” (more likely than any other cohort). 59% of 18-29 year olds have “lost contact with friends” since 2020 (more than any other cohort). And 9-in-10 young adults wish they spent more time with “their community.”

    In the 90s, only 16% of Americans had two or fewer friends. Today, nearly a third of Americans have two or fewer friends. Since the 90s, there’s been a 20% drop in Americans who say they have one person to call their “best friend.”

    Perhaps because 30% of people claim they don’t even know how to make a new friend.

    Today, 1-in-5 Americans say it’s been at least five years since they last made a new friend. After all, the number one place Americans made new friends was... the workplace... oof.

    More concerning, in the 90s, only 17% of young men and women reported that their parents were the first people they talked to when confronting a personal problem. Today, for young men, that number is 45%... This is likely driven by so many of them still living at home — just another driver of this craving for connection.

    If it’s this hard to maintain friendship, we can only imagine the effects on intimacy. A grim study reports:

    “Between 2009 and 2018, the proportion of adolescents reporting no sexual activity (alone or with partners) rose from 29% to 44% among young men, and from 50% to 74% among young women.”

    This was of course before the pandemic worsened these numbers.

    So, what does this pervasive loneliness and lack of intimacy result in?

    Between 2009 and 2019, the percentage of teens who reported feeling “persistent sadness or hopelessness” rose from 26% to 37%. Fast forward through the pandemic and in 2021, those who feel persistently sad or hopeless are now 44%.

    Never more connected.

    Never more alone.

    There are countless drivers of our collective despair here, but they’re bottlenecked by our inability to discuss them with others. In other words: a lack of connection can cause our hopelessness, and the hopelessness from countless other social ills is harder to resolve when we have no one to talk it through with.

    In the face of isolation, what we get is a lust for collective effervescence. Trend. Counter-trend. Cause. Effect.

    Swifties celebrating album drop holidays, devoutly studying lyrics and album art symbolism, ritualistically purchasing tickets, and passionately defending their creed and preacher are some of the most meaningful social connections teenagers and young adults experience today. Their intensity goes as far as prompting the Justice Department to open up an antitrust investigation against Ticketmaster. The sway of Swifties changes laws.

    Sure, fandom also existed decades ago, but decades ago people also had friends.

    Religion’s our current salvation for connection – an opportunity to transcend ourselves, link arms with like-minded users, and rekindle our spirituality in sync.

    From bonding over diets and shitcoins, to fanfic, anti-work and cryptic song lyrics, the pursuit of belonging is an impassioned, spiritual journey.

    Stability

    Belonging breeds stability — support. We’re no longer alone in this.

    Traditional religions previously built in this stability and structure: from organizing people and routine, to guidance on how we should approach our lives — direction.

    But as these institutions’ rigidity is questioned and their authority dissolves, we seek out grounding and answers elsewhere.

    As Burton writes,

    “Traditional religions, traditional political hierarchies, and traditional understandings of society have been unwilling or unable to offer compellingly meaningful accounts of the world, provide their members with purpose, foster sustainable communities, or put forth evocative rituals.

    And, in return, young Americans have lost their faith not simply in the tenets of a religion, but in civic and social institutions as a whole.”

    As previously explored in the rise of the paranormal,

    “Adopting a supernatural explanation is a flex of control.”

    “When so much is out of control, theory is a refreshing reclaim of self-sovereignty. Our own explanation beats the one that isn’t even given. Power is taken back.”

    Failed by the institutions meant to support us, we seek faith, support and answers elsewhere.

    Never before in history have so few people believed in God while so many believe in aliens. Two-fifths of Americans think extraterrestrials have been behind UFO sightings, up from just one-third a couple of years earlier.

    Aliens are explanations.

    No matter how far out they are, they at least bring us together, tether us and provide some answers:

    We are not alone.

    Hope

    The opportunity here is blinding.

    We require a new crop of organizations and figures to usher in refreshed meaning, purpose, ritual and belonging.

    But communicating religion as a solution is a kiddie band-aid floating atop a gushing wound. Gospel won’t cut it.

    The larger and more responsible opportunity here is to address the systemic causes of this social gash.

    Why is it that people are deficient in meaning, purpose, ritual and belonging today? And what’s the role of your product, service, brand, or organization in solving these needs?

    Hold up... Your deodorant, candle or cooking spray aren’t rituals. However... Community workout, candle-making or cooking classes can be.

    We have fundamental, existential crises to solve for. “Walking the walk” is what will ultimately attract the reverent and faithful.

    To understand why we’re at this moment with religion is to recognize that younger generations feel traditional institutions have been full of judgment, exclusivity, rigidity and uninspiring contradictions.

    It’s why they’re seeking alternatives offering the exact opposite: tolerance, flexibility, remix, choice, and a bit of levity (see: Birds Aren’t Real, another spiritual movement).

    In a moment of mass isolation, scarce faith, and deep uncertainties, we’re at a turning point.

    We have the opportunity to solve deep human needs: finding a sense of purpose, feeling secure in one’s future, developing a strong sense of self, kindling a sense of belonging with others, and empowering action stemming from common goals.

    How do we help others achieve this?

    ...Talk about a purpose.



    This is a public episode. If you’d like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe
  • Part I: The Tension

    There’s a new dilemma.

    Only it’s not that “new” of a dilemma.

    At the beginning of this summer, decades of glacier-paced cultural change were captured perfectly in a single weekend. The top of the charts revealed our endangered media ecosystem.

    You’ve heard this song plenty before. Thanks to inclusion in Netflix’s fourth season of Stranger Things, Kate Bush’s 1985 song “Running Up That Hill (A Deal with God)” found itself back in the zeitgeist. It went from 22,000 streams per day to 5.1M. Momentarily, a 37-year-old track was the most streamed song on Spotify.

    Meanwhile, Top Gun: Maverick, a sequel to the 1986 original, broke box office records, banking $156 million the same weekend. This was right before Jurassic World stomped in — the sixth installment since 1993. Then came Minions 2 — a sequel and a spin-off of the Despicable Me franchise, which in itself already had three installments.

    Further, in video games that weekend, 9 out of 10 best-selling titles were from franchises. And the New York Times Best Sellers list saw James Patterson, the Guinness World Records holder for the most #1 New York Times bestsellers, taking up two of the top five spots in fiction.

    It was the summer weekend for big premieres. But in fact, nothing about these releases was particularly new.

    Most noteworthy though, this pattern of mega-successful reboots stood against a backdrop of another story...

    These titles were released at a moment when more people are creating more content than ever before in history.

    Spotify boasts 70,000 tracks uploaded every day. YouTube receives 30,000 hours of new content every hour. Nearly 3M unique podcasts exist. Twitch is broadcasting +7.5M streamers, indie game releases and play are both growing year over year, and roughly 4M books are published annually in the U.S. — nearly half of those self-published, a +250% increase over just five years.

    On one hand, we have a booming Creator Economy, with an ever-expanding democratization of production tools for anyone with an idea. So much so that, according to a Zine survey of 1,000 Americans, 86% of people believe there is an overwhelming amount of entertainment available today.

    Yet on the other hand, we seem to have also found ourselves culturally stunted. Our box office and streaming platforms are soggy with the same regurgitated franchises. Reboots rule the roost and familiar faces hog our charts, while notable newcomers redefining genres feel few and far between. With this, 64% of people declare they are getting fed up with today’s reboots, sequels and remakes.

    What gives?

    How is it that during a moment of radical creator liberation and audience frustration, we’re finding ourselves with the same tropes and hooks?

    Chris Anderson’s 2006 optimistic Long Tail vision promised us that “specificity” — the shallow and obscure — would be economically feasible as the internet connected the niche to its audience. Aggregators would win, the odd would thrive, and those on the edges would celebrate. Creators could finally connect to their 1,000 true fans.

    But as seen from the macro view, a diverse, bottom-up media ecosystem is in fact not thriving.

    Instead, the inverse is happening.

    Homogeneity is winning.

    Part II: Sameness Everywhere

    In an analysis by Adam Mastroianni, a postdoc scholar at Columbia Business School, “the same” keeps rising to the top — across all media.

    Simply, there are fewer winners.

    Mastroianni calls this our Cultural Oligopoly. “A cartel of superstars has conquered culture,” he writes.

    “Until the year 2000, about 25% of top-grossing movies were prequels, sequels, spin offs, remakes, reboots, or cinematic universe expansions. Since 2010, it’s been over 50% every year. In recent years, it’s been close to 100%.”

    “Since 2000, about a third of the top 30 most-viewed shows are either spin offs of other shows in the top 30 (e.g., CSI and CSI: Miami) or multiple broadcasts of the same show (e.g., American Idol on Monday and American Idol on Wednesday).”

    “In the 1950s, a little over half of the authors in the Top 10 had been there before. These days, it’s closer to 75%.”

    “In the late 1990s, 75% or less of best selling video games were franchise installments. Since 2005, it’s been above 75% every year, and sometimes it’s 100%.”

    Software engineer Azhad Syed identified the same “Cultural Oligopoly” in his analysis of the music industry.

    “The number of different artists that crack the Top 100 is decreasing over time. In conjunction with fewer and fewer artists on the charts, each of those artists is charting 1.5x to 2x as many songs per year.”

    Meanwhile, “old” music — defined as having been released more than 18 months ago — now accounts for 72% of the market in the U.S. And though 18 months is admittedly a flawed definition of “old,” more widely, the consumption of old music is growing while demand for new music is declining.

    In assessing this record for The Atlantic, music critic and historian Ted Gioia writes,

    “Never before in history have new tracks attained hit status while generating so little cultural impact.”

    The old is winning financially, but it’s also winning creatively. Rolling Stone Magazine forecasts the continued rise of “interpolations” — the cousin of sampling in which song structure is borrowed and made “new.”

    “Don’t expect interpolations to slow down anytime soon — rather, the total opposite is likely. Publishing companies are sitting on mountains of instantly recognizable songs [...] Now that the business is focused around streaming singles, they have a chance to juice them once again.”

    As a result, the hottest private equity investments of late have been the publishing catalogs of accomplished artists. In fact, according to Dag Sandsmark, VP of Business and Legal Affairs at Sony Music Publishing,

    “The world’s largest music publisher has received twice as many requests for samples and interpolations from its catalog two years in a row.”

    Which translates to this: today, from film and TV, to books, video games, and music, there’s statistically less diversity rising to the top. And while it’s a given that everything in culture is a remix, the intensity of today’s reliance on what’s come before seems worthy of our attention.

    What’s causing this systemic malfunction?

    Part III: Causes of Creative Collapse

    01. Conflicting Ecosystems

    Most obviously, we’re discussing two very distinct and seemingly competing media environments.

    For creators, there’s the bottom-up, democratized access to tools, enabling massive amounts of content to be made and syndicated frictionlessly. In the Creator Economy everyone can be a player and “make it.”

    On the other hand, there’s the top-down, institutional power of filtering and recommendation, held by establishments incentivized by outsized financial returns. Large, risk-averse institutions — arguably just run by in-house lawyers and accountants at this point — play it safe to “protect shareholder value.”

    These divergent models are fundamentally at odds. It’s this dynamic that sits center stage in our paradox.

    When there are two drastically different sets of environments, incentives, and breeds of “Creators” today — everyday maker vs. established institution — it’s hard to expect normies to be plucked out and bet on by the gatekeepers already in power.

    02. It’s (Mostly) Trash

    Then there’s the question of quality.

    While the Long Tail is certainly diverse, it’s also made up of a lot of... noise. Amateurs are amateurs, no matter how many there are.

    A reason we don’t see new creators’ work rise is simply because the majority of it isn’t even worthy (or because there’s just too much to sift through).

    Another angle here is the lack of funding to fuel emerging creators’ pursuits. For a young, talented artist today, where are the grants or opportunities for backing outside of peer crowdsourcing?

    In the absence of infinite time but facing infinite content, we actually need some gatekeepers. Further, we need financing for those who aren’t... trash.

    03. Institutional Consolidation

    By its very nature, the Long Tail of content is segmented into ever-smaller pieces for ever-more discerning audiences. But as more create and the Long Tail lengthens, the curve’s extremes pull apart: the obscure gets more obscure, while the largest common denominator gets more... basic.

    Look no further than Netflix’s most recent pivots, which make it clear they’re no longer interested in many, risky, artistic bets, but instead, “Bigger, better, fewer.”

    Ironically, this is no different from what preceded them. And Netflix was once seen as the promising example of the Long Tail’s opportunity. Instead, over the last decade, Netflix has been slashing its library of titles. In 2010, Netflix housed 6.7K films. A decade later, that number is down 45%.

    Much of today’s mass-produced work aims to satisfy the average. As a result, we’re left with average. The middle is saltine-cinema: the largest financial opportunity.

    Take or leave Martin Scorsese’s critique of Marvel, his take on the state of film — this “consolidation” — shouldn’t be controversial:

    “The art of cinema is being systematically devalued, sidelined, demeaned, and reduced to its lowest common denominator, ‘content.’”

    This dovetails with one of Mastroianni’s own hypotheses for today’s Cultural Oligopoly: a systemic reflex towards concentration. The big habitually eats the small. Movie studios, music labels, TV stations, and publishers of books and video games have all consolidated. And this concentration is simultaneously occurring across religion, political parties, language, top visited websites, newspapers, cities, and most discussed: wealth and businesses.

    The winners we’re left with today are so large, they have to satisfy that largest possible common denominator in order to survive.

    04. Medicinal Nostalgia

    More choice isn’t always a good thing.

    In The Paradox of Choice, psychologist Barry Schwartz writes,

    “The fact that some choice is good doesn’t necessarily mean that more choice is better.”

    Presented with an avalanche of opportunity, especially with entertainment — something meant to bring joy — we stick to what we know. After all, what we know feels good.

    In Derek Thompson’s book, Hit Makers, he explains this tendency,

    “Most consumers are simultaneously neophilic — curious to discover new things — and deeply neophobic — afraid of anything that’s too new. The best hit makers are gifted at creating moments of meaning by marrying new and old, anxiety and understanding. They are architects of familiar surprises.”

    As creatures of comfort, we find the unknown scary and opt for the familiar. And this selection solves two things for us: choice paralysis and emotional turmoil. We’re relieved.

    A collective longing for the past might explain this paradox’s recent acceleration. The pandemic has triggered a mass re-appreciation of the old — safe and unthreatening. We also had plenty of time to explore catalogues while production was paused.

    As Dylan Viner, Managing Partner at TRIPTK, a cultural research and brand consultancy, points out,

    “Hateful of the present, and fearful of the future, we long for the past.”

    And this is true even when the past isn’t our own. According to Spotify’s own research, 68% of Gen Z enjoy media from prior decades because it reminds them of when times were simpler.

    History is a reassuring comparison to the unpredictability of what tomorrow may bring. So what if it’s played out? At least we know the words.

    05. Influence of the Aged

    As the users of today’s media platforms grow older, so too does the age of the content being consumed.

    According to market research by Ampere Analysis, Netflix has already saturated its core 18-34 and 35-44 audiences — 80% and 70% respectively. Growth within these age brackets has been stagnant. Now, Americans 50+ are driving the growth of the entertainment juggernaut, which must now keep luring and satisfying these older consumers.

    The growth of mature users on today’s content platforms recalibrates what’s surfaced and produced, diminishing the attention placed on the youthful, emerging or rebellious.

    06. Platform Persuasion

    We, the audience, have less control over this paradox than we think. Chris Dancy, an author and speaker on living with technology, remarks,

    “Technology has moved from Big Brother to Big Mother... Our quest to create the most frictionless experience is leaving people devoid of autonomy and longing for the feeling of 1st person living.”

    Faced with AI-customized playlists, “We think you may like” recommendations, and “Just for you” nudges, it’s up to once agnostic platforms to determine what’s now streamed... and popular.

    Algorithms were believed to untether us from the masses and offer paths for personalization, but instead what’s pushed today still emanates from the original institutional power we thought we were escaping.

    The consultancy Music Tomorrow researched how playlists are impacting emerging artists and found:

    “Over the last four years, major labels accounted for nearly 70% of the music featured in the ‘New Music Friday’ playlist on Spotify, 86% for ‘Rap Caviar’ and 87% for ‘Pop Rising’ playlists. Even though making and releasing music has become easier than ever, the support of a major label — and its marketing powerhouse — is one of the top determinants (if not prerequisites) for getting access to some of the most valuable streaming real estate.”

    Further explaining our disillusionment with algorithmic personalization and escape from the mainstream, Rob Horning of Real Life magazine writes,

    “Streaming services work strenuously to shape customers' disposition toward consuming if not specific tastes, making them more passive in their consumption, more willing to go along with what is trending and what is being surfaced on landing pages and home screens. It's no accident that searching these sites for something to watch is often an arduous and fruitless chore, inducing a learned helplessness and a pre-emptive predilection to surrender to the feed.”

    “For You” is not about pushing the limits of our artistic palates so much as it is a device to serve us what the platform projects and wants us to be satisfied with, herding us into more predictable silos that can then be targeted with more “precise” recommendations. Anything other than this is a liability to the business model.

    Ted Gioia also comments on this practice,

    “Algorithms are designed to be feedback loops, ensuring that the promoted new songs are virtually identical to your favorite old songs. Anything that genuinely breaks the mold is excluded from consideration almost as a rule. That's actually how the current system has been designed to work.”

    Algorithms are not designed to radically free us through superior discovery. They’re made to categorize us into more predictable buckets with predetermined labels. “New” is just a wrench in this machine.

    07. Creators ≠ Consumers

    Finally, a less validated hypothesis as to why we have an unrecognized Long Tail is perhaps that we find more value in making than consuming.

    With the tools of creation democratized, it’s never been easier to produce... but that doesn’t mean there’s a proportionately eager audience.

    Many may find more value in the creation of work, than the discovery and consumption of it. And further, much of today’s “creation” is in fact informed by existing IP.

    Kevin Allocca, YouTube’s Global Director of Culture & Trends, explained to me,

    “So much of user generated content today is still about a franchise — reactions, reviews or remixes. While there’s never been so much creativity and content, it’s not to say it’s all entirely removed from existing IP. It's fairly common now to see more media about, or related to the original work and consumption of it, than there is of the original work itself.”

    We’re duped by a mirage of new media today. In fact, we’re in a feedback loop of meta reactions. It’s a reaction video to a movie trailer for a sequel, or a walk-through stream for the latest in a video game franchise. We have videos about a video about a video. Today, so much bottom-up creation nods to legacy material. There’s less originality out there than we perceive there to be.

    In a culture where creating can be more satisfying than consuming, we’re left with a glut of both unwanted content and new content that’s actually just about existing content.

    Part IV: We’ve Got A Problem

    So we’ve established the existence of this creativity paradox, along with its spread, intensity and many causes. Hands turned up, we can accept this is just how it all turned out. So be it.

    But we can’t. Absolutely. Can. Not.

    We’re in a corrosive media ecosystem where the top 1% of bands and solo artists earn roughly 80% of all recorded music revenue — and by some estimates, the share for established artists is only getting larger. If this trend continues, we risk sabotaging both the opportunity and incentive for new artists to even participate.

    As Mastroianni writes,

    “Movies, TV, music, books, and video games should expand our consciousness, jumpstart our imaginations, and introduce us to new worlds and stories and feelings [...] Learning to like unfamiliar things is one of the noblest human pursuits; it builds our empathy for unfamiliar people.”

    If we stop watering “the new,” the new will die. Substitute “new” with whichever genre or medium you prefer. It’s the (originally) foreign, weird, edgy, counter-cultural and antagonistic that drives a healthy society forward. Or for another metaphor: suffocate today’s sparks of “the new,” and we’re left in the dark.

    We lose.

    For Gioia, we must breathe life into the Long Tail as it “creates a more pluralistic, diverse and multifaceted society.”

    The Long Tail vision hinted at making the blockbuster less pronounced... but the exact opposite occurred: the fringe is now endangered.

    So what can we do?

    Part V: Solutions

    01. Acknowledge New vs. New For Them

    We need content from today and for today.

    For Adrian Hillekamp of A&R Management with Concord,

    “Every generation needs its soundtrack and that can’t come from a back catalog. It has to come from the time, the moment, and have a particular feel.”

    And yet in 1986, the R&B singer Ben E. King returned to the top of the charts with “Stand By Me,” a full 25 years after its original release. Propelled by the success of the film of the same name, King’s sudden reappearance on the chart was as unexpected and remarkable as Kate Bush’s renaissance this summer.

    Compounding the similarity to today’s phenomenon: Ferris Bueller’s Day Off, famously capped by a parade-float performance of The Beatles’ version of the soul hit “Twist and Shout,” and the soul soundtrack of boomer-nostalgia vehicle The Big Chill, which was eaten up by kids and their parents alike.

    Perhaps content for today can be existing content that some just discover today. Contemporaneity isn’t a guarantee of innovation or inspiration. The current spike in interpolation isn’t novel. In fact, it might actually be cyclical. But that doesn’t mean it’s any less new to eyes and ears experiencing it for the first time.

    But this is not from today. We need that too.

    One simple way to break out of our cycle of reboots is to distinguish between what’s net-new vs. what’s new for a new audience. Multigenerational third acts of media aren’t all that bad... but only so long as there’s also content produced for today.

    02. (Re-)Build for Search & Exploration

    An underlying problem in 2022 is not a lack of great talent — it’s just that we can’t easily find it.

    The discovery, curation, distribution and amplification of quality content desperately needs a reassessment. In 1986 it wasn’t possible to discover everything. Today we have the technology to move far beyond traditional, monocultural points of discovery. But we seem to have stopped leveraging it — hypnotized by convenience, preferring to receive rather than search.

    An easy dismissal here is that the market will do its work — award the deserving — the good will inevitably rise to the top. But that’s simply not the case.

    The Long Tail is failing to identify the fringe and connect it to its eager audience. Ask yourself: when was the last time you found a new artist that you became an instant fan of? Was it an easy journey? Can you find your new favorite emerging author this afternoon — effortlessly?

    We can push our systems further. A manual override is required, this time with a hand on the upstarts’ side of the scale, ensuring increased reach. This is not to make all artists “mainstream,” but to connect more potential fans to their next favorite creator.

    In our research, 64% of people trust a streaming platform’s recommendations to surface content they’d enjoy. But 3 in 4 people believe streaming platforms can still do a better job at surfacing unpopular entertainment they may enjoy. The kicker: 62% want streaming platforms to recommend more unpopular content... even at the risk they may not like it.

    We’ve solved the barrier to entry, but we still haven’t cracked the barrier to discovery.

    Historically, one could effectively work their way up the Long Tail with ad spend, but today there’s so much congestion that the tactic is futile.

    Thomas Klaffke, Head of Research at TrendWatching, connects our paradox to Kasey Klimes’ thoughts on the opportunity to Design for Emergence (or really just “Design for Exploration”).

    “In design for emergence, the designer assumes that the end-user holds relevant knowledge and gives them extensive control over the design. Rather than designing the end result, we design the user’s experience of designing their own end result.”

    Rather than surfacing the same to all, platforms should trust their users to chart their own discovery paths — and not exclusively by disembodied, algorithmic means.

    For today’s creators, we need to redesign the on-ramps for potential audiences. And for consumers, we need to ask: what does falling down the rabbit hole of exploration feel like when it's actually enjoyable and not against our own will?

    03. Rewrite the Rules for Top-Down Risk

    Anita Elberse, a professor at Harvard Business School and author of Blockbusters, an analysis of this very phenomenon, simplifies our predicament:

    “Of course I understand concerns about the diversity of content, and the fact that certain elements people like are disappearing. But overall I'm not that pessimistic. It's not a hobby, it's a business.”

    Again the truism: “the market will do its work.” But what Elberse and others at the top fail to remember is that risk and diversity can drive business. And further, we’re in control of these business decisions. We set our own rules here. “It’s a business” is the opportunity, not the excuse.

    We found that across all age demographics, 81% of people say they want entertainment to better reflect unique experiences and tastes similar to their own, while 76% want TV, film and music producers to take more creative risks in what’s produced today.

    Risky is safe.

    We can reward creative risk-taking at an executive level. We can incentivize creative moonshots, and financially or emotionally support the underdogs. We can satiate unknown or unstated appetites by creating crowdsourced competitions or allocating funding for student works. We can iterate upon models to make the Long Tail even more financially appetizing.

    Some of the most beautiful works this year — Turning Red, Everything Everywhere All At Once and Marcel the Shell — are template and trope-defying pitches. They’re pure outliers. But their creative (and financial) success is in part due to the fact there are no comparisons to them. Creative differentiation, in itself, is a winning strategy. A24 for one has zagged, refreshingly embracing the financial upsides of originality and friction.

    Back to Scorsese. As he explains,

    “[Today’s films] lack something essential to cinema: the unifying vision of an individual artist. Because, of course, the individual artist is the riskiest factor of all. [Historically], the tension between the artists and the people who ran the business was constant and intense, but it was a productive tension that gave us some of the greatest films ever made. Today, that tension is gone, and there are some in the business with absolute indifference to the very question of art and an attitude toward the history of cinema that is both dismissive and proprietary — a lethal combination.”

    What plagues the music industry is no different. Today, some artists and insiders fear even listening to unsolicited demos, which could make them vulnerable to future lawsuits. Example: if a hit of theirs coincidentally sounds similar to something they’ve heard previously, as in the landmark “Blurred Lines” case involving Robin Thicke, Pharrell Williams and the estate of Marvin Gaye.

    We find ourselves in a moment where some are beginning to shy away from creativity.

    Nick Littlemore of Australian trio Pnau agrees:

    “[Today, culturally] we’re afraid of new ideas. They’re not road tested. So we’d rather do something that maybe has a little bit more of a guarantee of being successful.”

    What does tip-toeing around “the new” do to a generation? Nothing good, at a macro level.

    Creativity must be seen as a freedom, not feared.

    04. Reframe Success & Reevaluate the Charts

    We need to shout from the rooftops that it’s okay to be a creator without a billion views. MrBeast- and Dobrik-level fame is singular — not remotely available to all.

    Deciding to write a newsletter for 100 people is not just okay, but an incredible feat. Conversely, optimizing for attention to mimic, and (un)intentionally compare ourselves to, institutional celebrities at “the top” helps no one.

    We must reevaluate reach, views and ad-revenue as our go-to metrics of success, and instead aim towards the worth of depth. Call it the invaluable intensity of love. After all, we only care about what we can measure. And passion is a murky metric.

    Why again do we still have award shows? Research reveals barely half of people believe top music charts and box office numbers accurately reflect the quality of today’s entertainment. Surprisingly, it’s younger generations who are more likely than anyone else to trust these charts. Why? These audiences are dangerously more impressionable...

    According to Sari Azout and Jad Esber,

    “For the creator middle class to rise, we need to see higher resolutions of taste preference and a breakup with singular, discriminatory platform algorithms and the opinion of the ‘few’ that arbitrate taste and force today’s dominant aesthetic. With that, individuals can decide on ‘what’s best’ for themselves, allowing for the talent power law to play out across more taste vectors and spreading the opportunity to be perceived as ‘the best’ — and, with that, spreading the opportunity to profit from that.”

    We face a massive opportunity not just to rewrite the rules to incentivize risk, but also to redefine what a “successful creator” looks like.

    Culturally, we’re stuck on traditional metrics (more money or more eyeballs), and stuck on a traditional lineage through traditional milestones of success. Even for new creators today, legacy occasions like Late Night interviews, brand endorsements, commercials, SNL performances, award show trophies, and even YouTube Play Buttons are still seen as the aspirational markers of success. Why?

    We’re mistakenly still using dusty indicators of success in a contemporary media environment.

    Where are the awards celebrating the small and mighty? Who are the megaphones to draw more attention and financing toward marginalized creators? Where is the campaign reminding us that creativity dies in the shadows of reboots, and that merely making something is a celebration in itself?

    As Mastroianni shared with me,

    “It’s a naive and optimistic thought to think that the Long Tail is meant to fairly compete with Tom Cruise and that he should be dethroned on the same chart by a TikTok. In reality, he loses quite often, however it’s just not clearly documented [...] Perhaps this is a story about a continued and questionable value of ‘the charts.’ They don’t reveal the whole picture of what’s happening in culture.”

    We need new charts, new records, and new metrics to compare ourselves to, and more importantly, more healthily reach towards.

    05. Seek the Odd with Bottom-Up Risk

    We’re not off the hook here. We entertainment consumers must also be held responsible.

    For “consumer risk,” this means being open to a bit more experimentation. Mix things up. Diversify your media diet. Foreign subtitled B&W documentaries are not required, but repeating familiar patterns or acquiescing to the algorithm should be trained out of our habits.

    We can’t expect diversification if we first don’t at least taste it, signaling a desire for it. Scorsese's qualms with Hollywood today are rooted in this dilemma.

    “If people are given only one kind of thing and endlessly sold only one kind of thing, of course they’re going to want more of that one kind of thing.”

    To break out of this system, it’s up to us to express interest in anything other than Fast & The Furious 30. We can have that... but also more.

    It falls on the audience as well to zag, making demand for the unique unquestionably clear.

    Part VI: The Cliffhanger

    When reproduction is rewarded, monotony becomes omnipresent.

    When creativity — or lack thereof — is primarily driven by financial returns, risks are minimized. Audiences are left malnourished.

    And when the fringe doesn’t reach its audience, it dies without attention. Newcomers question the system, and bow out before even trying.

    Relevancy and reach are in tension. We must find ways to strike equilibrium. And if we don’t, our future literally becomes our past.

    We face a daunting opportunity to better support the niche, and introduce the new to its awaiting fans.

    Mind you, this is not a declaration to kill off the Minions, but a recognition that we have equally enjoyable — and richer — content waiting for us, just without direct lines of access. Thankfully, this is not an all or nothing scenario.

    We can have Tom Cruise mega-hits and freaky indie artists both thriving concurrently. And moreover, superhero installments alongside provocateurs exploring underrepresented communities or toppling taboos in more nuanced ways.

    We have choice. Choice in what we consume. And choice in whether we author new rules to get there.

    For as long as we remain mindful that there’s more out there to enjoy...

    So let's choose for ourselves. And celebrate others who do so for themselves.

    Thanks to Ben Dietz, Josh Chapdelaine, Jad Esber, Adam Mastroianni, Ted Gioia, Kevin Alloca, Sarah Unger, Dr. Marcus Collins, Dylan Viner and Chris Dancy for their ideas in both expanding and distilling this analysis. Thank you.



    This is a public episode. If you’d like to discuss this with other subscribers or get access to bonus episodes, visit zine.kleinkleinklein.com/subscribe
  • The META Trends are invaluable in identifying where the collective trend-forecaster psyche is at.

    But as we learned in a five year look back: biases thrive, agendas direct, risk is feared, quantification is scarce and toxic optimism influences.

    Deeper, as we learned in a series of exercises with AI: analyzed cultural data reveals that what we humans think is most important may not actually be so.

    All of this META Trend work is predicated upon industry trend reports... which, as we’re learning, may not be as dependable as we once hoped.

    The META Trends are insightful, but they, and the industry reports used to get there, leave us with an incomplete picture of what’s driving culture forward.

    Only with friction, daringness and originality can we analyze the sharp edges and fringes of culture that have influence. The weak, the uncomfortable and the complex help color our picture of the future.

    As we uncovered, AI can be helpful in discovering overlooked micro-trends which were hidden within the one million words of analyzed reports. However, many of these discoveries are things: Gut Health, Fluid Fashion, Privacy Enhancing Tech, etc.

    What we also need to augment is our ability to identify more nuanced, emotional, overlooked trends. But this is a task a human can do best. Creative extrapolation is our superpower.

    So to identify the overlooked, we can use the META Trends as filters to seek out what’s not surfaced.

    But we can also use these META Trends in another way...

    Sarah DaVanzo and I created a framework to spin out unique perspectives of any existing trend.

    4X Interrogative Questions_ To Identify the Overlooked

    * Outside = What is an outsider’s POV or experience?

    * Other Side = What is the inverse or contradictory tension?

    * Dark Side = What is the malicious or distressing angle?

    * Back Side = What is the devious or inappropriate twist?

    Interrogating non-obvious dimensions of even the most trite, overly reported trends can reveal new ideas, threats and opportunities.

    For example, let’s use the most reported trend for 2022: Eco- Everything, a continued obsession with sustainability and an integration of green-thinking into all products and services.

    Applying the 4X Interrogative Questions we get:

    Outside =

    How are those on the equator signaling new norms of climate migration? → What does this reveal about the effects of climate on the less prepared, mobile or privileged?

    Other Side =

    How do consumers reckon with still opting in for two-day shipping amidst climate marches? → What does this reveal about a fear of sacrifice and collective cognitive dissonance?

    Dark Side =

    How are therapists managing to counsel those with onset climate anxiety, a new diagnosis? → What does this reveal about the spillover, emotional toll of something once believed to just be a physical crisis?

    Back Side =

    How do we account for the carbon footprint of online porn? → How can we speak to the stigmatized and uncomfortable drivers of humanitarian risk?

    By using this framework, we can open the door to new, often overlooked components of any cultural discussion.

    We net out with valuable trailheads to then explore.

    Call them insights, counter-trends, or just components of the original trend itself — it makes no difference. These are simply elements of culture that should be acknowledged.

    To continue this exercise, let’s go through all 14x 2022 META Trends to reveal some critical, often overlooked pieces of the puzzle.

    01. Eco- Everything ♻️

    Overlooked = Sustainable living is unaffordable for many, climate migration will be unachievable for a growing elderly population, and paper straws and PR plays are jokes to Gen Z

    02. Digital Default 🌐

    Overlooked = 27.6M U.S. households still don't have home internet, motion sickness and wanting to know what’s behind us still curbs VR adoption, and our desire to experiment with identity runs deep

    03. xX~VIBES~Xx 🍄

    Overlooked = Indigenous communities are being destroyed from drug tourism, vibe-therapeutics only address the surface level of social trauma, and bad trips and high-THC will ironically exacerbate mental health issues

    04. Radical Inclusivity 🌎

    Overlooked = Many are frozen, genuinely unsure how to appropriately participate, deadly racism remains omnipresent and largely unaddressed at an institutional-level, and are dating apps designed to facilitate discrimination with race filters?

    05. Kid’ing 🪁

    Overlooked = A generation just skipped a pivotal developmental period of play, humor and nuance (our greatest assets) keep dividing us, and joking amidst a backdrop of complete devastation can feel complicated

    06. Home Hubs 🏠

    Overlooked = For some, more time at home means more abuse, young adults can’t even afford a “Home Hub,” and as co-living thrives, bringing home sexual partners — which is already declining — becomes even more difficult

    07. Algo_Minded 🧠

    Overlooked = Parents struggle to limit screen time, which stunts young adults’ social lives, McMindfulness has capitalism seeping into our wellness and dreams, and analog sex and companionship remain undefeated in bringing mental relief

    08. Renews & Reinventions 💭

    Overlooked = It’s a privilege to drop out — but those who most want to can’t, what happens when YOLO savings run out and startups fail?, and how quickly will employees return to a 9-to-5 in-office job when there’s economic uncertainty?

    09. Virtual Valuables 🔑

    Overlooked = Almost everyone wants to be an expert and “new things” are an easy topic, grifters can find ways to infiltrate and exploit any technology, and practical case studies of “world-changing tech” remain scarce here

    10. Now! Now! NOW! 🛒

    Overlooked = You only lose from bad commerce experiences — rarely winning from average or good ones, deals are the only consistent predictor of loyalty, and predictive commerce may be too fast and uncomfortable for many

    11. Me Inc. 💼

    Overlooked = People with personal brands are struggling to launch personal businesses, financial illiteracy is failing us, and sex-work is minting a new class of millionaires overnight

    12. Sounds Good 🔊

    Overlooked = Overcrowded growing cities are victims of noise pollution, how do the 360M hearing impaired globally participate here?, and erotic audio is soothing millions to bed

    13. Op-purr-tunity 🐩

    Overlooked = We are seeing pets as replacements to children and spouses, stigmatization and restrictions remain across establishments (parks, bars, theaters, etc.), and animal abuse has risen alongside pet adoptions

    14. Feed University 🎓

    Overlooked = The line between expert and amateur is razor thin, academic institutions are struggling for relevance, and this is only jet fuel to our existing mis- and disinformation dilemmas

    These overlooked components are not comprehensive by any means — and arguably some may be obvious — but they offer a slice of critical nuance that’s missing from our daily trend conversations.

    We consistently need tensions, devil’s advocates and contrarians to see the full picture.

    Mindfulness of blind spots, a willingness to both challenge and expand upon worldviews, and respect paired with interrogation are the imperative and often missing traits required to more comprehensively grasp the zeitgeist and author a preferred future.

    Complete Series:
    Part I: Using AI To Quantify & Size META Trends
    Part II: How To Spot Trends with AI
    Part III: A_Framework_To: Find Overlooked & De-bias Trends



  • After Sarah DaVanzo and I leveraged NWO.ai’s invaluable AI to score and re-rank the META Trends, we were left stuck with one finding:

    Both the global and U.S. AI data-driven ranks were significantly different from the original human rank. The AI declared that what we humans thought was most important was not actually the case.

    Were we just splitting hairs of importance here, or were these divergent rankings a signal that our META Trends (which came from source material) were not as important as we once thought? Maybe more influential cultural shifts are out there waiting to be exposed.

    And if so, how can we find them?

    We debated important-but-missing META Trends for weeks. But how important could these be if the experts hadn’t collectively highlighted them within the reports we analyzed? At the same time, according to our work analyzing the last five years of META Trends, the “most important trends” being reported haven’t changed much.

    There was no denying, though: important, nuanced cultural shifts were missing from our list of 14 META Trends. So, how could we identify and highlight these overlooked trends... and further, in a way that isn’t subjective (Sarah’s opinion against mine)?

    We considered simply naming our favorite cultural phenomena not included in the original META rank, or surfacing interesting leftover trends from the 40+ reports that didn’t make their way into one of the 14 META Trend themes. But both approaches would have thrown us into the same trap we called out after publishing the most recent annual META Trend report: the prevalence of bias and the scarcity of risk in the trends and foresight field is concerning at best...

    While Sarah and I both have historical proof and a pedigree of accurate trend forecasting, our life experiences and methods differ. Just listing our favorites felt too qualitative. So we designed another experiment with NWO.ai.

    Experiment 04.AI META Trend Identification_ Comparison

    Left on its own, could AI identify similar or different — perhaps missing — META Trends?

    This time we fed all of the text from the original 40+ sourced trend reports into the NWO.ai platform. Nearly one million words of text. We figured the AI could process this information with a different, extraordinary comprehension compared to the humans who attempted the same when creating the original 2022 rank. We hypothesized the AI would make more connections — ergo, identify META Trends completely overlooked by the humans.

    By crunching all of the reports and instructing the AI to identify META Trend patterns (clusters, themes, etc.), would it come back with missing valuable, social shifts? Answer: Not even close.

    We were very wrong to believe AI could complete this exercise similarly to an expert trend spotter.

    From the one million words of text inputted, the AI used Natural Language Processing (NLP) and clustered like-with-like, arriving at 72 clusters of “trends.”
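
    To make that step concrete: the like-with-like grouping can be illustrated with a minimal, stdlib-only Python sketch (NWO.ai’s actual pipeline isn’t public, so the bag-of-words representation, the greedy merge, and the `threshold` value here are all invented for illustration). Each report becomes a term-frequency vector, and a document joins the first cluster whose centroid it resembles:

```python
from collections import Counter
import math

def tf_vector(text):
    """Bag-of-words term frequencies for one document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(docs, threshold=0.3):
    """Greedy like-with-like grouping: each doc joins the first
    existing cluster whose centroid it resembles, else starts one."""
    clusters = []  # list of (centroid Counter, [doc indices])
    for i, doc in enumerate(docs):
        vec = tf_vector(doc)
        for centroid, members in clusters:
            if cosine(vec, centroid) >= threshold:
                centroid.update(vec)  # fold the doc into the centroid
                members.append(i)
                break
        else:
            clusters.append((vec, [i]))
    return [members for _, members in clusters]

reports = [
    "remote work and hybrid offices reshape work culture",
    "hybrid work and office culture keep evolving",
    "plant based food and sustainable diets grow",
    "sustainable food systems and plant based eating",
]
print(cluster(reports))  # → [[0, 1], [2, 3]]
```

    On a real corpus the number of clusters falls out of the similarity threshold; the point is that pure like-with-like grouping produces topic buckets, not trends.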

    Interestingly, there was very little overlap with our 14 META Trends — a handful at best, which were really just optimistic stretches.

    Further, the AI’s clustered “trends” weren’t even trends, but rather general topics like “technology” and “pandemic.” It’s not to say that these themes weren’t impressive — they were — but these findings aren’t helpful to an experienced cultural strategist who can arrive at more provocative groupings.

    So to answer the question:

    Could AI identify overlooked META Trends? No.

    But, feeling we were onto something, we asked a follow-up...

    Experiment 05.AI Micro-Trend Identification_ Extrapolation

    Rather than identifying large patterns which we’d call META Trends, could we use the AI to identify and rank smaller, perhaps overlooked micro-trends from within the reports?

    To figure this out, instead of having the AI merely organize the reports’ text, we instructed it to take its newly created meaning of the one million words (i.e. its 72 clusters) and use diverse internet data sources to measure and rank each and every signal. The goal of the experiment was to understand the consumer energy behind every micro-trend and rank them accordingly. We called these the AI-identified trends. It was a complicated process, but essentially we asked the AI to take the signals it captured from the 40+ industry reports, use them as a launching-off point, and then use all the available online information to rank and validate them.
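
    A toy version of that extraction-and-ranking step might look like the following sketch (the bigram heuristic, the stopword list, and raw frequency as an “energy” score are invented stand-ins for the platform’s multi-source scoring):

```python
from collections import Counter
import re

STOPWORDS = {"the", "and", "of", "to", "in", "a", "for", "is", "are", "will", "into"}

def candidate_bigrams(text):
    """Adjacent word pairs with no stopwords: crude stand-ins for
    named micro-trends like 'gut health' or 'fluid fashion'."""
    words = re.findall(r"[a-z]+", text.lower())
    return [
        f"{w1} {w2}"
        for w1, w2 in zip(words, words[1:])
        if w1 not in STOPWORDS and w2 not in STOPWORDS
    ]

def rank_micro_trends(reports, top_n=3):
    """Pool candidates across all reports and rank by raw frequency,
    standing in for a richer, multi-source energy score."""
    counts = Counter()
    for report in reports:
        counts.update(candidate_bigrams(report))
    return counts.most_common(top_n)

reports = [
    "gut health is the quiet star of wellness reports",
    "brands lean into gut health and fluid fashion",
    "fluid fashion and gut health will keep growing",
]
print(rank_micro_trends(reports, top_n=2))  # → [('gut health', 3), ('fluid fashion', 2)]
```

    Swapping raw frequency for external signals (search volume, news reach, etc.) is the step that would turn a phrase count into a genuine energy score.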

    The AI came back with 1,062 newly scored micro-trends.

    This was a ranking of AI-identified, human-overlooked trends, via abstracted meanings and associations all from the 40+ reports’ text. It turned up gold.

    Precisely: these were trends buried, or hidden, within the industry reports, which the AI pulled out using advanced NLP techniques and a vast amount of data.

    Here is a curation of the top 40 ranked, overlooked micro-trends discovered by the AI:

    Inclusive Insurance, Disruptive Winds, Food Inflation, DAOs (Decentralized Autonomous Organizations), Rising Energy, Super Apps, Longevity Food, Unisex Fragrance, Dating Fatigue, Clothing Rental, AI-Music, Biodynamic Farming, Gender Affirmation, Rainwater Harvesting Systems, Wearable Robotics, Dopamine Dressing, Sleep Coaches, Financial Coaches, Gut Health, Caregiver Leave, Self-Hypnosis, Alcohol-Free Beer, Psychoactive Tea, Sperm Freezing, Touch-Free, Post traumatic, Carbon Pawprint, Sustainability Calculator, Land Stewards, Privacy Enhancing Tech, Anonymous Marketplace, Subscriptions, Fluid Fashion, Land Availability, Period Products, Paid Menstrual Leave, Mutual Aid, Workplace Conditions, Banned Advertising, and Media Anxiety

    Perhaps most noteworthy: “Russia Initiative” was a buried “trend” identified by the AI from the structured text of the industry trend reports. The AI then used various online data sources to measure and score the energy behind this (and all of the other 1,061 signals). While the AI picked up this shift, not a single report explicitly mentioned a pending war at the time of writing. The AI was literally able to give voice to cultural change indirectly alluded to within the human-authored reports.

    Conclusion: Humans for Sensemaking & AI for Discovery and Inspiration

    Our experiments found that humans outperform AI in decoding the zeitgeist and defining cultural shifts at large (META Trends, Mega or Macro Trends). We’d call this “sensemaking.” This skill is essentially being able to synthesize wide-ranging, already structured data, and intuitively pattern match and creatively stitch narratives. Humans have an edge over AI when it comes to seeing the big picture and making non-obvious connections.

    Humans can derive META Trend patterns. But as we found out, the AI cannot. Humans bring context to the table: historical knowledge, existing understanding of worthy trend criteria, and most importantly, ties to business use cases and priorities. Simply, we humans know what to look for. But... this is also our fatal flaw, as it translates into bias...

    Meanwhile, we uncovered AI has a distinct advantage when it comes to unifying, processing and analyzing diverse unstructured data sets at scale and with unmatched speed. AI beats humans when finding the most noteworthy weak and emerging signals (aka micro-trends) — concepts undetectable to the human eye due to the sheer volume of data. AI’s superpower in this context is “discovery” and “inspiration.”

    We also learned AI works well when it deconstructs and analyzes both human-structured data (ex. our original META Trends) and massive troves of unstructured data, using them as source material or trailheads in its own search for novelty.

    Ultimately, with insight from AI’s more precise rankings, its detection of signal vs. noise, and its delivered inspiration, it’s undeniable:

    AI is a crucial fixture in a successful cultural intelligence system.

    This series of experiments run by Sarah, NWO.ai and myself demonstrates the need for, and role of, AI and cultural data at scale.

    This is the future of cultural intelligence.

    That’s the clearest takeaway here.

    The optimal cultural intelligence system combines Humans-and-Machines in a series of orchestrated hand-offs, repeating the pattern of construction and deconstruction.

    Humans are best utilized for sourcing and defining large, complex social themes, while AI is best utilized for prioritizing these weighty trends, sourcing micro-trends, and checking humans’ sometimes messy, qualitative approaches.

    Nobody’s perfect.

    And together is better than alone.

    But one question remains: Are there other trends out there unreported by the industry’s published trend reports, unidentified by the META Trend analysis, and undiscovered by the AI?

    Answer: No doubt.

    Complete Series:
    Part I: Using AI To Quantify & Size META Trends
    Part II: How To Spot Trends with AI
    Part III: A_Framework_To: Find Overlooked & De-bias Trends



  • Earlier this year Sarah DaVanzo and I published the fifth annual 2022 META Trend analysis, a distillation of 40+ industry trend reports to ultimately identify the most frequently reported (i.e. noteworthy) trends for the year. Fourteen (meta) trends were identified to represent what the entire trends industry was collectively forecasting.

    During this time we also announced that for the first year, we’d quantify each of these META Trends to more precisely size and evaluate their cultural influence. It’s been a moment, but we finally crunched the data...

    To get here though, we first had to answer a series of very thorny questions: How do we define the criteria or “borders” of each META Trend? Which sources of data should be leveraged to score each? Should we study these trends’ influence within a U.S. or global data set? How do we even collect and analyze this data at scale? And how do we complete this exercise without adding any human interference, tipping the scales of objectivity?

    Only once we answered those could we shine light on the larger questions at hand:

    * By leveraging cultural data, would AI rank the META Trends differently than how the humans did?

    * Left on its own, could AI identify similar or different (perhaps missing, overlooked) META Trends?

    * And conclusively, what are the best tasks and roles for humans versus AI in order to develop a successfully orchestrated cultural intelligence system?

    While asking ourselves all of these questions, we formed a partnership with the team at NWO.ai. The NWO.ai platform (a finalist for LVMH’s Innovation Award and an Industry Cloud partner of SAP) was formed in 2020 to identify consumer signals before they become exponential. Its algorithms have learned over the last 2.5 years, and the platform now boasts strong trend-prediction accuracy. They effectively quantify culture. With Sarah’s previous experience collaborating with them earlier this year, we determined their platform would be the perfect tool for our quantitative META Trend analysis.

    Experiment 01: 2022 META Trend Scoring & Ranking, Humans vs. AI

    As for our first question: if AI were to collect cultural data against each of the META Trends and then score their importance to rank them, would the AI ranking match or diverge from the original human ranking?

    Answer: Different.

    As a reminder, our “human ranking” was completed by Sarah and myself manually counting the frequency of similar trend mentions throughout the 40+ industry reports. For example, sustainability trends received the most attention (and page real estate) across the analyzed 2022 reports, and hence the “Eco-Everything” META Trend was born and ranked in the top spot.

    So, to more precisely size and rank these META Trends, we built a series of keyword “portfolios”, one per trend, for NWO.ai’s AI to score. These portfolios were essentially groupings of keywords and phrases (i.e. boolean queries) representing each of the 14 META Trends. Sarah and I authored the portfolios ourselves, but to curb any subjectivity, we leveraged only the language used in the original reports’ descriptions. To be clear, Sarah and I did not forecast these 14 META Trends – these were simply the most talked-about concepts throughout the industry.

    NWO.ai then measured the cultural importance of each META Trend via its portfolio of keywords. Consumer interest was quantified using a variety of data sources spanning social, news publications, search, investments, patents, scientific journals, e-commerce data, and even film scripts. Ultimately, an AI-derived “Impact Score” was calculated by aggregating volume (quantity of these signals across sources), frequency (volume per day), reach (distribution across publications), and so on. These scores were finally normalized on a 0-100 scale to fairly pit each of the META Trends against one another and create our official AI ranking.
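    The mechanics above can be sketched in miniature. NWO.ai's actual inputs, weights, and aggregation formula are proprietary, so the signal fields and the simple additive aggregation below are assumptions for illustration only; the one detail taken from the text is the final min-max normalization onto a 0-100 scale:

```python
from dataclasses import dataclass

@dataclass
class TrendSignals:
    # Raw cultural signals gathered for one META Trend's keyword portfolio.
    # Field names are illustrative; NWO.ai's real inputs and weights are not public.
    volume: float     # total signal count across all sources
    frequency: float  # average signals per day
    reach: float      # number of distinct publications carrying the signals

def impact_scores(trends: dict[str, TrendSignals]) -> dict[str, float]:
    """Aggregate each trend's raw signals, then min-max normalize onto a
    0-100 scale so all META Trends can be fairly ranked against one another."""
    raw = {name: s.volume + s.frequency + s.reach for name, s in trends.items()}
    lo, hi = min(raw.values()), max(raw.values())
    return {name: 100.0 * (r - lo) / (hi - lo) for name, r in raw.items()}
```

    Min-max normalization guarantees the strongest trend scores exactly 100 and the weakest exactly 0, which is what makes a head-to-head ranking of otherwise incomparable keyword portfolios possible.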

    This AI ranking from the cultural data behind each of this year’s 14 META Trends revealed that the convenience economy (Now! Now! NOW!) is dominating culture, with more than double the impact of some other META Trends. In other words, Now! Now! NOW!, the endless demand for innovation surrounding online shopping, has a cultural impact more than 3x the size of the META Trend xX~VIBES~Xx, our desire to tune in, drop out, and create spaces or purchase products that help us fulfill and focus.

    With these AI scores in hand, we then re-ranked the META Trends and compared the result against the original human ranking.

    This is where things got interesting.

    We had rank discrepancies.

    While the human ranking process (i.e. mention count) identified Eco-Everything as the most prevalent META Trend... according to Eco-Everything’s portfolio of AI-scored keywords, it is in fact only the 9th most culturally impactful META Trend globally. Meanwhile, Now! Now! NOW!, originally identified as just the 10th most important META Trend, received the highest Impact Score from the AI: according to the data, it is #1.
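    How far apart are two rankings of the same 14 trends, really? A standard way to quantify that gap is Spearman's rank correlation, sketched below. The full human and AI rank tables aren't reproduced in this piece, so the example rankings in the test are hypothetical; only the tool is standard:

```python
def spearman_rho(human_rank: dict[str, int], ai_rank: dict[str, int]) -> float:
    """Spearman rank correlation between two rankings of the same trends:
    +1.0 means identical order, 0 means unrelated, -1.0 means fully reversed.
    Uses the classic sum-of-squared-rank-differences formula (assumes no ties)."""
    n = len(human_rank)
    d_squared = sum((human_rank[t] - ai_rank[t]) ** 2 for t in human_rank)
    return 1.0 - (6.0 * d_squared) / (n * (n ** 2 - 1))
```

    A single number like this makes "the AI ranked things differently" testable: swaps as large as Eco-Everything's #1-to-#9 slide pull the coefficient well below 1.0.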

    Perhaps, unfortunately, we as an industry are engaging in some wishful thinking here:

    According to millions of cultural data points scraped and analyzed, consumerism beats out industry trend reports’ hype of sustainable innovation.

    Experiment 02: 2022 META Trend Scoring & Ranking, U.S. vs. Global

    When that first set of divergent rankings came back, we were using only a U.S. data set, achieved by geo-fencing the analyzed cultural data to North America.

    We wondered if there would still be discrepancies in the ranking if we created a new, specifically global rank by opening up our aperture, sources and data.

    Was there a difference?

    Answer: Not really.

    The META Trend rankings by the AI are by and large the same globally vs. U.S., statistically reinforcing America’s cultural influence. In other words, META Trends’ impact in the U.S. reflects similar impact globally.

    Therefore, this scoring suggests that U.S. trends can serve as proxies for global trends, as they ultimately ripple outward from the States.

    But more importantly, because we tested the scoring twice (for the U.S. and globally), it reveals that AI data-driven trend scoring and ranking is impressively consistent.

    So, to zoom back out: the rank difference between the AI and human methods shows glaring variation. This should make us all pause and question the subjectivity and accuracy of the industry’s reported trends. After all, we were scoring what the humans (the most experienced trend forecasters, no less) originally published.

    How valid are these concepts if millions of data points and AI couldn’t mirror our collectively proclaimed importance of them?

    Or conversely, maybe these were in fact the most worthy trends to score and we’re just splitting hairs between the most important of the important.

    Or again, on the other hand, just perhaps, there are trends out there with higher Impact Scores that were simply never identified by the experts...

    In any case... The AI scoring revealed that Now! Now! NOW!, Home Hubs and Radical Inclusivity are the top three META Trends of 2022, from both a U.S. and global perspective. This suggests that no matter one’s vertical, these three META Trends are both qualitatively (human-identified) and quantitatively (AI-scored) important.

    Whether you’re national or global, strategically double down in these spaces.

    Experiment 03: AI Deconstructs the Anatomy of Trends, Identifying Drivers

    In exploring NWO.ai’s platform, we noticed the AI could do something the human process never could: dig deeper and uncover the keywords (i.e. signals, concepts, trends, etc.) beneath each META Trend’s surface. In other words, what is driving each META Trend forward? Because we had originally created portfolios of keywords for each META Trend in order to score them, we also had the opportunity to score each META Trend’s individual DNA strands and determine which specific elements have the most influence. Knowing what drives a trend, by ranking its most important components (i.e. keywords), can help us envision how it will evolve over time.

    NWO.ai granted us the ability to plot each META Trend’s portfolio of keywords on a 2x2 matrix and rank them by their current growth, speed, tone and forecast, essentially answering the question: which specific keywords are growing or declining in volume, and is this change exponential or momentarily still?
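    The spirit of that 2x2 can be sketched as a simple quadrant classifier. The two axes chosen here (current growth vs. forecast momentum) and the quadrant labels are illustrative assumptions, not NWO.ai's actual taxonomy:

```python
def quadrant(growth: float, forecast: float) -> str:
    """Place a keyword on a 2x2 matrix: current growth (x-axis) vs.
    forecast momentum (y-axis). Labels are illustrative, not NWO.ai's."""
    if growth >= 0 and forecast >= 0:
        return "accelerating driver"  # growing today, expected to keep growing
    if growth >= 0:
        return "peaking"              # growing today, forecast to slow
    if forecast >= 0:
        return "emerging"             # flat or shrinking today, forecast to grow
    return "declining"                # shrinking today and forecast to keep shrinking
```

    Sorting a trend's portfolio into quadrants like these is what lets you say not just that a META Trend matters, but which of its components are pulling it forward and which are dead weight.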

    Conclusion: AI Trend Scoring, Ranking & Driver Identification Is Superior to Humans

    By leveraging AI to score each of the META Trends, not only did we create a more accurate prioritization with a U.S.-vs.-global nuance (something the human process could never achieve), but the AI also let us reach a new level of granularity and inspect the anatomy of each META Trend, learning which components most significantly influence its growth.

    But from these experiments, we also have a warning:

    An AI data-driven system can prioritize completely different trends than us humans.

    While AI can confidently unlock insight humans can only dream of, its results are so divergent from the human approach that healthy questioning is required: of the emerging software, but primarily of the humans’ “expert” input.

    The AI declared that what we humans thought was most important was not actually the case.

    This raised our next question:

    What does the AI find most important?

    Complete Series:
    Part I: Using AI To Quantify & Size META Trends
    Part II: How To Spot Trends with AI
    Part III: A Framework To Find Overlooked & De-bias Trends


