A Thousand Things to Talk About
3 minutes | May 10, 2019
3 minutes | May 9, 2019
719: Talkin’ ’bout
3 minutes | May 8, 2019
4 minutes | May 7, 2019
717: Speak Up
2 minutes | May 6, 2019
The lexicon available to us to express delight, excitement, joy, or any number of other emotions doesn’t always feel quite adequate, at least in the English language. There could be ten thousand available words, but that doesn’t mean one will fit like you need or want it to. Then, in 2016, the Oxford English Dictionary added “squee” to the official list of quote-unquote real words in the English language, defining it as an exclamation used to express great delight or excitement. But, as per usual, that’s not where the story of squee began. Instead, squee started around 1865, when it was used to describe a high-pitched squealing or squeaking sound in the sentence “when, squee, rhepe, twiddle, went the third violin.” Then, in the comic Magnus Robot Fighter 4000 AD, the word “squee” was used to describe the sound of a robot dying after it’s been killed by the hero. And it was a 1998 Usenet group conversation that the modern usage of squee has been traced back to: someone was chatting about getting something Star Wars in the mail and expressed their happiness and excitement. The reaction we as humans have to seeing something cute is pretty universal, and while the sound and the language we use to express that excitement may change, the squee itself seems close to universal.
3 minutes | May 3, 2019
715: Binge Watch
The Macmillan Dictionary Blog wrote, quote: The word ‘binge’ first appeared in English in the mid-1800s to mean ‘to soak’. Around the time of World War I, the term ‘binge’ was used to refer to eating or drinking in excess. The term binge-watching can be traced back as far as 2003, but it didn’t come into common usage until around 2012. But just because we didn’t start hearing the phrase binge-watching until recently doesn’t mean the behavior itself is new. Quoting from a Morning Consult article: Derek Johnson, associate professor of media and culture studies at the University of Wisconsin-Madison, said the concept of binge-watching isn’t new or limited just to streaming services, given that cable channels will stack episodes of the same show next to each other. “There’s potential for binge-viewing in classical, network television schedules,” he said. “Looking at cable channels during the daytime, a huge number of cable channels and electronic programming are blocks of these back-to-back episodes.” In other words, who cares if it’s on Netflix, Hulu, HBO Now, or any one of hundreds of other services, channels, or even stored media options — a huge number of individuals have binge-watched something when they have the option. In 2017, Netflix released statistics on the quote-unquote year of binge watching. And those statistics revealed a lot. As Inverse.com reported, quote: Netflix also combed through aggregate user data to find some interesting stats on our collective binge-watching tendencies. Generally, members complete their first binge in only three days. The most popular shows to binge first were Orange is the New Black, Breaking Bad, and The Walking Dead, in that order. Then again, what counts as binge watching might depend entirely on who you ask. After all, Netflix defines binge watching as completing at least one season of a show within a week, while academic researchers define it as watching two or more consecutive episodes of a show at least once a week.
Taking that second definition: according to a new Morning Consult/Hollywood Reporter poll, 60 percent of adults who watch shows on demand said they binge-watch […] with 15 percent reporting they binge each day, 28 percent several times per week and 17 percent about once per week. So whatever it is you choose to watch, if you make the choice to binge-watch, you’re not alone.
2 minutes | May 2, 2019
714: Emergency Prep
There are probably a thousand different angles one could take on personal emergency preparedness — and the short version is, not nearly enough people are, or even feel that they are, prepared for emergencies. There are a variety of reasons for this, ranging from the fact that preparedness can be expensive to the fact that we tend to view threats that are amorphous and unclear as non-existent. What gets very interesting, though, is that individuals aren’t the only ones that need to prepare for emergencies. Governments also prepare for emergencies of various types — and are often asked to be the first responders when emergencies happen. In 2014, researchers dug into the level of disaster and emergency preparedness in 27 EU member states. That research utilized the World Health Organization checklist, and found, quote: The average level of disaster management preparedness in the health systems of 27 European Union member states was 68% (Acceptable). The highest level of preparedness was seen in the United Kingdom, Luxemburg, and Lithuania. Considering the elements of disaster management system, the highest level of preparedness score was at health information elements (86%), and the lowest level was for hospitals, and educational elements (54%). In other words, emergency preparedness isn’t necessarily something that every government just magically gets right either. It’s all a matter of doing our best, and for many of us, our best may or may not be ready for whatever emergencies we may face.
3 minutes | May 1, 2019
713: Clean Your Plate
I’ll give fair warning here — if you have any kind of a fraught relationship with food, then following up on any of the research mentioned in today’s episode (or heck, today’s episode in general) could be very triggering. I offer this warning as a fat — and happily so — person myself, and as someone who takes issue with how much obesity is turned into a boogeyman in research. All of that said, there’s a LOT of research out there about how parents, teachers, child-care workers, and the environment around kids help develop and nurture their relationship to food. And while a lot of the research is tied back to obesity for headlines or in order to describe health outcomes, much of that research remains valid when viewed through the lens of intuitive eating. In a study about parenting practices with adolescents published in a 2013 edition of the journal Pediatrics, researchers found that even into the teen years, parents are much more likely to engage in one of two controlling food-related practices: either pressure-to-eat or artificial food restriction. It doesn’t tend to start in the teen years, however. It starts much younger, and the impacts last much longer. A 2018 paper called “Justifying by Healthifying” found that the clean-your-plate mentality — otherwise called “consumption closure” — creates a mindset where we underestimate the caloric load of a food and overestimate the health value of just about any kind of food if we’re primed to eat most or all of it. As Parents magazine quoted: “If children are encouraged to eat when they’re not hungry often enough, they can lose touch with the signals of hunger and fullness and are more prone to overeat,” says Katja Rowell, M.D., a childhood feeding specialist. “I think many adults have lost touch with their own hunger and fullness cues and don’t trust that children can manage their own eating. It helps to remember that only the child knows how much he is hungry for. 
Some days he may eat a lot, other days less.” In other words, it’s when we force ourselves – or force kids – to ignore their body’s signals for hunger and fullness, that we start to lose touch with those signals.
4 minutes | Apr 30, 2019
It wasn’t very long ago that we talked about those who report themselves as having no religion, and the worldwide trends that play into that conversation. One thing we didn’t talk about, however, was faith. Not faith in the religious sense, but faith in the sense of having a very strong belief in, trust in, or confidence in something. Not necessarily a higher power, even — just confidence or belief in anything at all. Philosopher Neil Van Leeuwen, in his paper “Religious Credence is not Factual Belief,” presents the idea of credences – a mental state or a moral choice to defer to those we see as authorities. If we see someone as trustworthy, then we tend to take on the beliefs they present to us, even if we don’t fully understand those beliefs. This is why, oftentimes, those with very strong political beliefs may not be able to tell you, down to the dollar or the study, the reason for those beliefs, or why those who join a cult follow along when all reason would lead to a different conclusion. As researchers put it in the book “The Enigma of Reason,” as reported in The New Yorker, quote: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups. And that collaboration itself may be part of the reason we are willing to put faith in something — even when we don’t fully understand it. Quoting from that same New Yorker article: In “The Knowledge Illusion: Why We Never Think Alone,” Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. 
What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins. In other words, our very mental and social existence may come down to faith in many ways. Faith isn’t inherently religious and doesn’t have to be – it just has to rely on trust and confidence. As Peter Thompson wrote in the Guardian, quote: any movement that seeks social change and improvement is a faith-based one. It has to be, otherwise there would be no reason to hope for something better. The more economy and society changes its moral and ethical ground, the more there will be a desire to bind us back in to the old certainties. There is probably no way of effectively mobilising people against […] if they don’t believe in something, no matter how abstract or apparently bizarre it may seem to others.
3 minutes | Apr 29, 2019
711: Too Hard On Yourself
In 2013, two researchers published, in the Journal of Psychotherapy Integration, a review of the research and a discussion of the phenomenon of self-criticism. That particular paper defined self-criticism as, quote: a conscious evaluation of oneself that can be a healthy and reflexive behavior, but also can have harmful effects and consequences for an individual. In other words, reflecting upon your thoughts, beliefs, words and actions can be very useful. But like all things that get overwhelming in too heavy a dose, self-criticism tends to tip the scale into a harmful realm when it becomes excessive. Generally, the research on individuals who are very self-critical and/or perfectionist tends to treat those particular behaviors as associated behaviors – or even symptoms – of depression and anxiety. But several researchers are starting to dig into the idea that, rather than being a symptom alone, even neurotypical individuals may have topics or areas where they are very self-critical in ways that become damaging. And while it can be tempting to think that being hard on yourself is part of what drives success, it’s worth paying attention to the conclusion of the 2011 article “The Effects of Self-Criticism and Self-Oriented Perfectionism on Goal Pursuit:” Although self-criticism has previously been shown to be related to diminished goal progress, a controversy remains regarding the potential association between aspects of “positive perfectionism,” such as self-oriented perfectionism, and enhanced goal progress. The results of the five studies demonstrated a consistent pattern of negative association between self-criticism and goal progress.
2 minutes | Apr 26, 2019
Leadership is one of those topics that seems to come up time and time again. Back in episode 486, we examined what it means to be a supervisor, and the fact that only one-third or so of people aspire to any kind of leadership at all. Even so, it’s often true that those who become leaders may not have wanted the title, or may not even officially hold leadership positions. Once you are in a leadership position – whether you want to be or not – there is no dearth of opinions about what kind of leader you might be, books to read about leadership, or even entire degrees at every level you can get in leadership studies. The HubSpot blog outlines seven different types of leadership styles, ranging from democratic to autocratic, strategic to transactional. Meanwhile, the Association for Talent Development instead breaks leadership up into five styles – managerial, relational, motivational, inspirational, and transformational. The blog WiseToast goes even further, with 12 different types of leadership outlined, and The Executive Connection outlines nine common leadership styles. While there’s crossover in all of these, there’s also a drive here similar to ones we have talked about in the past – a drive to categorize and organize the experiences we have, the ways we interact with others, and the ways we think about ourselves in a way that may be useful to us. The power of that kind of examination, however, comes not from the fact that we can name our leadership style (official or not) as one thing or the other — the power comes in using that information, feedback, and framework as a jumping-off point for determining how we want to interact with the world.
3 minutes | Apr 25, 2019
There are a variety of things that could be called “allergies,” and often are called allergies by those who experience reactions to various types of foods. Medically, an allergy is an immunological reaction to a particular food that causes hives, anaphylaxis, itching, sneezing, runny eyes, and the like. However, there are a number of other reactions that individuals will often refer to as an allergy — gastrointestinal distress, for example, of the type that lactose-intolerant individuals may experience when consuming milk. The World Allergy Organization reports that, quote: It is generally accepted that food allergy affects approximately 2.5% of the general population, but the spread of prevalence data is wide, ranging from 1% to 10%. In 2008, the Centers for Disease Control in the United States reported that, quote: From 1997 to 2007, the prevalence of reported food allergy increased 18% among children under age 18 years. Eight types of food account for over 90% of allergic reactions in affected individuals: milk, eggs, peanuts, tree nuts, fish, shellfish, soy, and wheat. In 2007, a study carried out in Poland that looked at the prevalence of children’s allergies in urban and rural environments found that, in general, children who live in urban environments tend to have a higher prevalence of allergies than those in rural environments. As the Journal of the American Medical Association Network highlights, quote: food allergy prevalence estimates from these recent national surveys exceed 9% of US adults, suggesting that food allergy may affect more US adults than previously acknowledged. Although some children with food allergy develop natural tolerance, others retain their food allergy as they enter adulthood. Adults can also develop new food allergies, and evidence suggests that certain food allergies (eg, shellfish and fin fish) may be more likely than others to develop during adulthood. 
In other words, allergies aren’t really all that uncommon, and the reactions we casually refer to as allergies are even more common. The prevalence seems to be growing, too, but even if that weren’t the case, you or someone you know likely has an allergy to something.
3 minutes | Apr 24, 2019
708: Long Nights
The circadian rhythms that govern how we sleep are impacted by many of the things around us, the things that we experience, and the environment that we live in. That environment includes the light and dark cycles around us – both natural and electric. There have been a variety of studies done throughout the years on the impact that the natural rhythms of night and day have on human bodies. One particularly interesting study, done in 2012, looked at the impact of long winter nights on individuals working in the Arctic. That study found that, when deprived of light for long periods of time, many individuals maintained some semblance of a 24-hour light cycle, though some ended up defaulting to cycles that are more or less than 24 hours. While it’s easy to think that conditions such as Seasonal Affective Disorder are simply a function of latitude, that may not be the case. Quoting from a 2005 overview of Seasonal Affective Disorder: An estimated 10 to 20 percent of recurrent depression cases follow a seasonal pattern. Although a summer pattern of recurrence is possible, the predominant pattern involves fall/winter depression with spring/summer remission. In U.S. community surveys, SAD prevalence ranges from 9.7 percent in New Hampshire to 1.4 percent in Florida. In North America, SAD prevalence increases with latitude, but the correlation is nonsignificant in other parts of the world. In fact, it may be that the impact late nights and long nights have on our psychology is partially a function of the mindset we have going in. Quoting from Science Nordic: Leibowitz and Vittersø found a correlation between how far north people lived in Norway and how positively they viewed winter and winter darkness. More importantly, “Vittersø and I found that positive thought patterns in winter were related to life satisfaction, positive emotions and seeking challenges that can lead to personal growth,” Leibowitz, an American psychology student, said. 
Vittersø emphasizes that the study has not been published in a peer-reviewed journal, and that the correlations they found do not mean causation. Nevertheless, he says that the study suggests what might make life easier for Norwegians during a time of year that most people don’t really like. “We who live in Norway are probably not aware that we see so much to enjoy during the winter, because it is so normal for us,” Vittersø said. “But it is probably much less common to look at winter this way elsewhere in the world.”
4 minutes | Apr 23, 2019
Self-care is a whole lot more than a throwaway phrase… or a multi-billion-dollar industry. As NPR put it in 2017, quote: Self-care existed long before millennials did. Ancient Greeks saw it as a way to make people more honest citizens who were more likely to care for others. In her 1988 book, A Burst of Light, Audre Lorde wrote that “caring for myself is not self-indulgence, it is self-preservation and that is an act of political warfare.” In 2015, according to the Pew Research Center, more millennials reported making personal improvement commitments than any generation before them. They spend twice as much as boomers on self-care essentials such as workout regimens, diet plans, life coaching, therapy and apps to improve their personal well-being. They’ve even created self-care Twitter bots. No matter how many financial resources you may or may not have at your disposal, there seems to be a question of what may or may not be considered a luxury. After all, luxury has a number of definitions, many of which are based on the kind of world we live in. As Vice reported, quote: “We live in a society where structures are extraordinarily powerful in defining who people are and what will happen in their lives,” Carl Cederström, an associate professor of organization studies at Stockholm University, said. “Yet at the same time, we seem to live in a time where we refuse to see that that is the case. 
We really do want to believe in the American dream or in the fantasy that everything could be solved through individual measures, through self-care, through techniques of magical thinking, the power of positive thinking, or whatever it might be. This is like when Paul Ryan suggested that poor people see a life coach as a requirement to receive federal aid, in order to “design a customized life plan to provide a structured roadmap out of poverty.” The sad irony is that much of the history of the popularization of self-care comes from activists—women, people of color, and the LGBTQ communities—as a reaction to institutional shortcomings. Self-care was a kind of protest. It was a way to look after marginalized bodies and minds when no one else bothered to.” So how do you manage to treat yourself to luxuries, especially when your financial or social position may not allow for what general society considers a luxury? A 2018 survey done by a hotel chain asked Brits to name what they considered to be luxuries, and many of the answers were the type that require time more than money. A few of the options? Reading a book. Catching your favorite movie. Spending five minutes with yourself. Going for a picnic. So how about you?
3 minutes | Apr 22, 2019
706: Personal Style
Though the phrase “style switching” is generally used to refer to how someone chooses and then alters their words, phrasing, and cadence of speech, there are a number of other situations where that same type of shifting applies. It’s important to note, too, that fashion and style are two different things. Quoting from fashion blogger Sarah Scoop: Personal style is something different entirely. It’s the style you cultivate for yourself and – most importantly – it doesn’t move with fashion. Sure, you can add and change things if you find something on trend that you particularly love, but your key looks stay the same. […] and no fashion whim is going to deter you from them. Style switching — changing or altering the fashion that you choose when it comes to your personal clothing style — can have just as much of an impact on the view others have of you. As seamstress Burke Brewer put it on her blog, quote: […] our clothing tells the world who, and where we are in life. Regardless of why you wear what you wear, I believe in wearing clothing that I love, that makes me feel fabulous & invincible, and that celebrates the uniqueness of my body with all its curves and flaws. I believe in wearing clothing that genuinely feels like me. My style will inevitably shift again — when I become a mother or age gracefully — and I’m excited to see what I’m creating 10, 20 and 30 years from now. I look forward to looking back to see where life, and my style, takes me. That style, and the choice of what clothes you put on, can be more than just a choice. In fact, it can be a powerful and empowering moment to choose the clothing that you present yourself with. As Tom Rasmussen wrote for CNN about trans and non-binary folks, quote: For some of us, fashion allows us to pass, and to remove demarcations of a gender assigned to us. For others, it’s a way to opt out, to ask questions, to curate a whole new gender away from a binary. For all of us, it gives us autonomy over our bodies. 
The clothes we wear allow us to make a choice, they allow us to escape the expectation of normativity, and the boredom of homogeneity.
3 minutes | Apr 12, 2019
As with many things related to how we define ourselves, the religious traditions we grew up in and around have a big impact on the religious choices we make as adults — and the religious choices we don’t make. Setting aside all of the discussion of whether particular religions are good or bad for the world, or for individual people, even the numbers themselves are particularly interesting. The Pew Research Center has a long-running study of both nation-specific and worldwide religious trends, and while you can dig through the UNData demographics statistics database, I would suggest setting aside an entire afternoon for that – at the minimum. Worldwide, religion is still going strong – very strong. In 2015, those reporting no religious affiliation — which includes those that are atheists, yes, but also includes those that consider themselves spiritual or religious without any specific affiliation to an organized religion — accounted for 16% of the world population. In that same year, Christianity accounted for just over 31 percent of the world population, Muslims 24%, and Hindus 15%. Taking into account birth rates, conversion patterns, and other available data, the Pew Research Center estimates that somewhere between 2030 and 2035, Muslims will overtake Christians as the world’s most prominent religious affiliation. Though this is the worldwide pattern, it certainly is not the pattern in individual countries. In the US and in much of Europe, the share reporting no religious affiliation at all has been steadily, though very slowly, increasing. In the US, around 23 percent of adults are religiously unaffiliated. In the UK, the numbers are even more stark – in 2017, 53 percent of the residents of the UK reported themselves as having no religious affiliation, up a full five percent from a similar survey just two years before. 
Then again, none of these surveys have any kind of box to tick for the phrase “spiritual but not religious” or “not religious, but spiritually practicing,” which are phrases being used more often to differentiate from these historically synonymous terms.
5 minutes | Apr 11, 2019
704: Organized Activity
There’s almost no debate, though there is research aplenty: extracurricular or organized activities tend to be a net benefit to kids. From learning teamwork and social functioning to working through challenges to trying out new interests, doing organized activities outside of the school day is generally considered to be a good thing. That doesn’t necessarily mean it’s an equal-opportunity thing, though – whether country to country or within the United States. Quoting from an Atlantic article that explored the economic differences of extracurricular participation in the United States: While there’s always been a gap in access to extracurriculars, participation numbers for the two groups increased at about the same rate until they started to diverge precipitously—in the early 1980s for non-athletic activities and in the early 1990s for sports teams. In 1972, roughly 61 percent of low-income high school seniors, and 67 percent of their more-affluent peers, participated in one or more non-athletic extracurricular activities. A decade later, participation rates rose to about 65 percent and 73 percent, respectively. But by 1992, while 75 percent of upper- and middle-class seniors reported participating in extracurriculars, involvement among disadvantaged students dropped back to 61 percent. By 2004, the number for low-income seniors was down to 56 percent. These numbers are echoed by the Pew Social Trends report of 2015 looking at parenting in the United States, which found that, quote: Parents with higher income and education are more likely to report that their children participate in various extracurricular activities. Among parents with an annual family income of $75,000 or higher, 84% say their children participated in sports or athletic activities in the 12 months prior to the survey; 62% say their children took music, dance or art lessons. 
By contrast, some 59% of parents with annual incomes of less than $30,000 say their children participated in sports, and 41% say their children took lessons in music, dance or art over that period. This is not something that happens just in the United States, though. One of those hoping-for-the-clicks surveys, this one out of the UK, found that over a quarter (28%) of parents in the UK have been put into financial difficulty funding their child’s extracurricular activities, such as sport and music classes. The study revealed that nearly a third of children (31%) take part in three or more extracurricular activities at school a week, with the average UK parent spending £237.32 a year funding them. However, a fifth of parents spend more than £300 every academic year on their child’s after-school activities, with a further 10% spending more than £500. In other words, organized activities can be an expensive add-on to a child’s education — and in more ways than just money. In the blog Tellement French, a working mother who has lived in both the US and France draws a comparison of the extracurriculars in both countries. Quote: The practice of activities in the US is very time-consuming: every musical or sports activity requires at least 2 days of practice per week. In France the Conservatory asked us several weekly participations but the sports activities were limited to one participation a week (or sometimes 2 when there were games). That way kids can practice different activities as hobbies unless they plan to become professional in an artistic or sportive field. And the truth is, many of us do not end up becoming professionals in the things we did as kids. So…
3 minutes | Apr 10, 2019
Avoidance – actively running away from or mentally staying away from something that’s in our world – is a lot more than just doing dishes and cleaning the bathroom to not do our taxes. Avoidance is a recognized phenomenon in psychological research, and has even — no shocker — been categorized. In the book “Mind and Emotions: A Universal Treatment for Emotional Disorders,” the authors break down five specific types of avoidance. They are, quote: 1 – Situational avoidance, where someone just completely avoids a situation that they are concerned may cause them stress, panic, or anxiety. This can also include avoiding a particular trigger, such as dogs for those that have a fear of dogs. 2 – Cognitive avoidance, where someone takes actions to suppress or reject certain experiences, thoughts, feelings, or emotions. This type of avoidance often comes into play when someone does not want to process or confront something, so instead suppresses or replaces it with something they find more pleasant. 3 – Protective avoidance, or taking actions to protect one’s self. While there are healthy — and important — ways to protect yourself, this is also the kind of avoidance that can manifest in obsessive-compulsive disorder. 4 – Somatic avoidance, which happens when someone attempts to limit or tamp down the physical responses associated with emotions, such as tightness in the chest or getting out of breath. The idea is that by suppressing the physical symptoms, the emotional reactions are experienced less. 5 – Substitution avoidance – straight-up trying to replace one feeling with another that feels more tolerable. The thing is, all of these mechanisms of avoidance can be very useful coping mechanisms – but as with all coping mechanisms, there comes a time when the mechanism may not serve your emotional health. 
As a 2011 study of breast cancer patients published in the Journal of Health Psychology found, breast cancer patients who actively avoided dealing with and processing the topic of their cancer showed higher levels of depression, stress, and anxiety. So what topic are you avoiding right now?
3 minutes | Apr 9, 2019
702: Yay Failure
Today’s question: What “failure” in your life has turned out well? “There are millions of ways to fail, and only one to succeed.” “Success is often achieved by those who don’t know that failure is inevitable.” “Giving up is the only sure way to fail.” “There is no failure except in no longer trying…” And that’s just the very beginning of a very, very long — nigh on endless — list of quotes about failure. Almost all of them come from individuals who would be viewed as “successful” by many measures — money, fame, or professional quote-unquote success. The interesting thing is, what we often don’t spend a ton of time talking about is failure itself. This isn’t just my personal feeling — it’s something backed by research. In the 2016 research paper “Even Einstein Struggled: Effects of Learning about Great Scientists’ Struggles on High School Students’ Motivation to Learn Science,” the authors discovered that when students were let in on the fact that Einstein, Marie Curie, and other well-known scientists struggled in their own work, the students were much more willing to work through what they perceived as failures and continue in science education. And that finding – and the lack of other research to back it up – is what prompted one of the authors to get to work creating EPIC — the Education for Persistence and Innovation Center — at Teachers College, Columbia University. In their own words, EPIC is, quote: dedicated to studying the critical role that failure plays as a catalyst for learning, innovation, leadership and career development. EPIC is founded on the premise that success is often the fruit of persistence in the face of failure and adversity, which in turn motivates individuals to become creative problem solvers on the road to success. EPIC is dedicated to deepening our understanding of failure. In other words, studying how failure itself may actually, ironically, be absolutely necessary for success. It’s all a matter of how you define it.
3 minutes | Apr 8, 2019
701: Favorite TV
Today’s question: What was your favorite TV show growing up? The 80s and 90s are haunting us. Everything from the comic books that kids of those years read to the TV shows that hundreds of thousands of people watched is being constantly resurrected, reworked, or rebuilt. And for the shows that aren’t being reworked for the modern world, there are also new shows being set in those worlds. The favorite TV shows we had growing up are the favorite TV shows being created again, much to the joy of advertisers and to both the joy and frustration of the fans who loved those shows the first time around. And this isn’t a new phenomenon: in the 70s, shows like Happy Days tried to re-create the shows and feeling of the 1950s. Our favorite shows are, in many ways, coming back to haunt us. But they may be more literally haunting us than we realize. In an article for BuzzFeed News, Sara Tatyana Bernstein writes, quote: Nostalgia is only one way of looking at history; it functions, essentially, to make the past a safe place to play. The ’80s are over, so GLOW can tell stories about sexual harassment and the danger of stereotypes, and Stranger Things can deliver horror stories about government conspiracies. These stories are disturbing in the present, but nostalgia provides a sort of protective shield between now and then. In other words, even if we’re using these shows to work out contemporary issues, a nostalgic story requires the past to stay put. But revivals in which a show’s original characters are unearthed to exist in the present are inherently unsettling; they are hauntings. And because of that, they can offer us a way to see how we got from There to Here — and maybe even help us deal with the consequences of our history. Nostalgia also enables us to rewrite the past so it looks more like we wish it had been. So be it She-Ra or Pokémon, Roseanne or Full House, the shows that were our favorites growing up may just be a useful lens through which to look at our current experience.
After all, it is entirely possible to both acknowledge that something is problematic, and enjoy it. Especially when we recognize that we’re doing exactly that.
© Stitcher 2021