Following up on the Sept. 29 article: “Why you shouldn’t sleep in the same room as your phone”

Commentary

Elizabeth McLaughlin, Editor

Synced

Attention is a powerful tool: one that enhances our human experiences, and one that is highly prized and captured by technology and media.

It has been nearly four months since I wrote the article entitled “Why you shouldn’t sleep in the same room as your phone.” After re-reading it this afternoon, I couldn’t help but notice ways in which I could improve upon it. I also noticed that my relationship with my phone has changed a bit since Sept. 29, the day that article was published. So without further ado, here’s the follow-up that no one asked for.

Full disclosure: I sleep with my phone in my room again. One night, as we were all wrapping up whatever conversation filled our living room that evening, my roommate Mia commented to my other roommate, Ren, that perhaps leaving her laptop downstairs overnight wasn’t the best idea in the event of a home invasion. It sucks that we have to think that way, but her advice came on the heels of a string of home invasions on our block, committed by a man who lives just three doors down from us. Safeguarding ourselves and our belongings was at the front of our minds. I plugged my phone into its usual spot next to our TV and went upstairs, not thinking much of Mia’s comment… until I started perseverating on it.

“If someone breaks in and steals my phone, they wouldn’t be stealing just my phone — I keep all of my cards in a wallet attached to my phone case. My debit card, my other debit card, my other debit card, my expired debit card, my credit card, my ID, my school ID, my expired ID. They would have it all. And I would have to re-obtain all those elements of my identity.” Talk about a headache! This was the opposite of my intention when I decided to sleep in a phone-less room, so needless to say, that night was the last night I heeded my own advice.

But that night launched a series of moments with myself in which I began to evaluate exactly how much of my identity is tethered to a device. My copious collection of cards aside, my phone is also a portal into the various versions of myself that I choose to share with others. I try to limit my social media to just Instagram these days, but even there, I have two accounts: a personal one and an art one. To me, there is not much delineation between what I might share on the former versus the latter; they represent the same person, just with different photos and captions. But my personal lack of boundaries between the two doesn’t matter much; by making both accounts, I chose to fragment my identity, creating two canalized versions of the one person I know myself to be. And that’s a little unnerving.

I fell down a rabbit hole of making a mental note of all the online avatars I’ve created for myself over my 21 years of life. It all started in late elementary and early middle school, when I created way too many One Direction fan accounts. In 2012, I made my Facebook account to connect with relatives in other states and to play Farmville. Somewhere around then, I made multiple Tumblr accounts, mainly for writing and One Direction. Thanks to the strong community that Directioners so famously fostered, I made virtual friends all over; I even inherited a meme account from one of those friends, who disposed of it and all of its ten thousand followers as easily as one would dispose of a used tissue.

I was very present online, because that’s what you do when you’re a teenager in the 21st century. Even now, I can remember in great detail the types of environments I was exposed to from platform to platform. My One Direction fan accounts are where I was first exposed to digital art; I remember becoming good friends with a Brazilian girl named Paula whose digital paintings of Niall Horan still impress me to this day. On my meme page, I exchanged units of cultural ideas and symbols with tens of thousands of people across the world (after all, that’s what the word “meme” literally means: a unit of cultural information spread by imitation, as defined by Richard Dawkins in his 1976 book “The Selfish Gene”). On my Tumblr accounts, I had access to a range of writers; the ones I found most interesting were the other 16-year-old girls who just so happened to be situated somewhere else on the planet. On a darker note, I was also exposed to the nasty eating disorder environment that was all too familiar to girls like me in that era.

I had seen and experienced so much thanks to the Internet. And looking back on it, I can’t help but feel protective of my younger self, and of the young girls now who are experiencing their own digital renaissance as I type these words. My concern isn’t reserved for teenage girls, either; it extends to all of us who regularly interact with the Internet. I’m realizing more and more with each passing day how much of who we are, individually and collectively, is informed by the ways technology captures our attention. Four months out from my initial article and almost two years into a global pandemic, I’m constantly taking note of how precious our attention is — and how sophisticated, calculated and well-funded the attacks on it so often are.

The word “attention” comes from the Latin ad + tendere, meaning “to stretch toward.” I think it’s important to make the distinction that when we pay attention to something, the word itself does not describe a “bringing forth,” but instead a “stretching toward.” It’s as if we meet the object of our attention where it is, perceive it and then move on to the next object once we’ve had our fill. If paying attention were instead a kind of bringing forth, wouldn’t something be lost in pulling the object out of its original place in space and time? Think of the difference between seeing the Eiffel Tower in pictures on our phones and going to Paris to see it with our own eyes; the former trades much of the richness and enjoyment found in “stretching toward” for the ease of “bringing forth.” Attention is a beautiful thing, I’m learning, precisely because it invites us to stretch ourselves toward something else; to step outside the perceived boundaries of the self and attempt to meet something (or someone) where it is.

I’m also learning that the way media is presented to us is sometimes more compatible with a “bringing forth” model than a “stretching toward” model. We don’t have to go digging to find something that will capture our attention. In fact, we often do the opposite of digging: we just peruse the surface until we’ve had our fill. (If you don’t believe me, just consider the way a Twitter homepage is designed. Loren Brichter, the man who invented the “pull to refresh” mechanism, has since expressed great remorse for designing something so adept at holding our attention prisoner.) I’m afraid we settle too often for that kind of surface-level attention, which never quite satiates our curiosity. In other words, it’s easier to let ourselves be presented with information than it is to actively seek out what we want to know. I’ve realized that this whole critique is why I’m so against TikTok. The idea that an algorithm brings forth content for us which we then find worthy of our attention — to a frighteningly accurate degree, I might add — feels more than defeating; it feels uncreative.

It feels like the tools that enable us to fully experience what it means to be human are becoming dull. We have too many apps and places where we can let others (people, algorithms) bring forth objects worthy of our attention; too infrequently do we actually stretch ourselves toward something else. I can’t blame us; actively shifting our attention toward x is harder than being passively presented with x, and it’s far too easy to dismiss x with a simple flick of our fingers. This is all to say: I would rather spend my time on the Internet actively seeking out things that interest me than be a receptacle for whatever the algorithms have identified as worthy of my attention at that moment. In other words, I’d rather be (more) in charge of what I pay attention to than let my attention be channeled and canalized by external forces.

Harry Frankfurt is a philosopher who was born in 1929 in Langhorne, PA. I mention his birthplace because when I discovered it via a quick Wikipedia search, I was delighted to learn that he grew up just 30 minutes from me. I think it’s important that we supplement our interests with context; the geographic context this philosopher and I share has made his impact on me all the more potent. I definitely need to dive deeper into his work on free will and the concept of a person, but there is one idea that I want to work into this article before I sign off: the idea of wanting to want what we want.

Say it’s Sunday morning and what I want is to scroll on social media, and I know that I want to do that because I wake up and reach for my phone. But is that what I want to want to be doing? Sometimes, yes. More often, no. So when I don’t want to want to do that, I don’t do it. At the risk of typing “want” way too many times, I’ll leave off with this: I want to enjoy what I do as I do it; I want what I do to add to the fullness of being human. So I have conversations like these with myself and others, consciously considering attention at both the individual and collective levels. After all, what we pay attention to is what we make our lives out of; I don’t know about you, but I want to make a life I love.

For more on the topic of attention, and to find out where I learned many of the facts used in this article, check out Jenny Odell’s book, “How to Do Nothing: Resisting the Attention Economy.”

The iPad is harming our kids and they have no idea

Commentary

Elizabeth McLaughlin, Staff

I am twenty years old and still learning how to use technology. Sure, I’m pretty adept at all the basics: Microsoft Office, Adobe Creative Cloud, etc. I can even create (mediocre) animations on digital art software like Procreate. But I’m still — and likely always will be — learning how technology fits into my everyday life. I’m learning how to have a relationship with technology. I’m old enough to remember a time when social media didn’t exist. But I’m also not old enough to recall the days when computers weren’t woven into the very fabric of our daily lives. I’m positioned to have a relationship with technology that demands almost complete reliance, but not utter and perpetual immersion.

But the generations below me and my peers have been living in the heyday of social media since birth. My mom is a nanny and she has been with the same child for his entire life — seven years. I’ve gotten to see firsthand his introduction (at a very young age) to the iPad — and all the problems that ensued. I’ll preface this article by saying that I recognize the importance of and need for technology in our daily lives; I’m not longing for some pre-digital return to nature because I know it’s not possible. I’m simply noticing and reporting the detriments of constant exposure to technology.

The issues with childhood engagement with technology can be viewed from multiple angles. For one, it can interfere with kids’ basic human functions, such as sleep. “Electronic stimulation has been shown to interfere with both falling and staying asleep,” according to Northwestern University’s parenting expert Katherine Lee. In fact, a study published in Pediatrics “found that children who sleep near a smartphone or another small-screen device get less sleep than kids who are not allowed to have these types of devices in their bedrooms.” If you’ve ever babysat a child, you know how hard it can be to get them to turn off the iPad. And if you’re trying to get them to turn it off because it’s bedtime? Good luck.

Jamie Grill/Getty Images

The average child is saturated with screen time every single day, and this reliance on devices has detrimental effects.

So it’s clear that the simple act of sitting in front of a screen can cause issues. But what about what’s on the screen? Any time anyone engages with technology, there are two parties involved: on one side of the screen, there’s a supercomputer pointed at your brain, trying to figure out the perfect next thing to show you. On the other side of the screen is your prefrontal cortex, which evolved over millions of years to do its best at goal articulation, goal retention, staying on task and self-discipline. This is true for all of us who have brains. But imagine it’s not you on that other side; it’s a seven-year-old, or a five-year-old, or even a two-year-old. At that age, I was playing with Lincoln Logs and American Girl dolls — neither of which had supercomputer powers.

As a twenty-year-old, I’ve had time to learn how to differentiate information that benefits me from information that harms me. But kids haven’t had that time yet. In 2015, Aaron Mackey, then a graduate fellow at Georgetown University Law Center’s Institute for Public Representation, was part of a coalition that included the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy. The coalition alleged that “it’s deceptive to tell kids that this [YouTube] is a safe product… Anyone, with just a little bit of searching, can find a lot of inappropriate content.”

Moreover, the infinite possibilities offered by YouTube can cause addiction. According to American Addiction Centers, those at higher risk of addiction are unable to self-regulate, are impulsive and lack a strong sense of moderation. YouTube’s recommendation algorithm doesn’t want you to self-regulate; it wants to regulate for you. It wants you to impulsively click on whatever catches your eye, because that’s how it builds a profile on you. It rejects moderation in favor of saturation. And yet, we’re placing it in the tiny hands of our precious kids.

Sleep issues, inappropriate content and addiction — the list doesn’t stop there; these are just three angles from which one can view the greater problem of technological saturation. Kids aren’t the only victims; everyone who engages with technology is susceptible to these ailments. But kids are the most unaware. They have no idea what the iPad is doing to them, yet they are sincerely affected by it. They’re just kids, and it’s our job to care for them.

Having only scratched the surface of this topic, I’m questioning whether giving our kids iPads is a good idea. In fact, I know it’s a terrible idea — but we’re going to keep doing it. So how can we ensure that our kids don’t ruin their sleep schedules and develop addictive personalities while viewing inappropriate content online? I’m only twenty years old and I don’t plan on having kids for a while, but I sincerely hope I’m able to answer that question by then.