
A Critic of Technology Turns Her Gaze Inward


In the spring of 1977, when Sherry Turkle was a young professor at the Massachusetts Institute of Technology, Steve Jobs came to visit. While he toured the campus and met with her colleagues, Turkle was cleaning her apartment and worrying over the menu for the dinner she had agreed to host.

It took more than 40 years, until she was writing her memoir, “The Empathy Diaries,” for her to realize how angry that incident made her. She was at the beginning of her career chronicling how technology influences our lives, yet she wasn’t asked to join her colleagues as they spent the day with the co-founder of Apple.

“Why not me?” she said in a video interview last month. It has taken her decades to come to that question, and it reflects her desire to turn the ethnographer’s gaze inward, to examine herself the way she has long studied her subjects. That is central to her new book, she said: “Here is the practical application of what it means to have a conversation with yourself.”

Turkle, 72, is big on conversation. In her 2015 book, “Reclaiming Conversation,” she argues that talking to each other, having an old-fashioned voice-to-voice exchange, is a powerful antidote to life on screens. A licensed clinical psychologist who holds joint doctorates in psychology and sociology from Harvard, she scrutinizes what our relationship with technology reveals about us, about what we feel is missing from our lives, what we fantasize technology can supply.

Her daughter, Rebecca Sherman, said that she and her friends occasionally became the subjects for her mother’s roving inquiries. For example, when is it considered acceptable, while dining out, to look at your phone? It was Sherman, 29, and her friends who explained to Turkle the “rule of three”: As long as at least three other people were engaged in the conversation, it was OK to disappear (temporarily) into a screen.

“The Empathy Diaries,” which Penguin Press is publishing on March 2, traces Turkle’s progression from a working-class Brooklyn childhood to tenured professor at M.I.T. In the first years of her life, she lived in a one-bedroom apartment with her mother, aunt and grandparents. She slept on a cot between her grandparents’ twin beds. Her father was almost entirely absent.

Her family couldn’t afford tickets to High Holy Days services at the local synagogue, so they dressed up and greeted their neighbors on the temple steps instead, careful to imply they would be attending services somewhere else. But they recognized Turkle’s intelligence and didn’t ask her to help with the housework, preferring that she sit and read. Years later, when she graduated from Radcliffe on scholarship, her grandfather was in attendance.

Turkle also writes about the relationships that shaped her. One of them was with her stepfather, Milton Turkle, whose arrival interrupted Turkle’s early living arrangement and whose name her mother instructed her to take as her own. She was never to reveal to her classmates or her younger siblings that she had been born the daughter of somebody else. Her own father was rarely spoken of, his very name a taboo.

“I was turned into an outsider, who could see that things were not always what they seemed, because I was not always what I seemed,” Turkle said.

When Turkle first began to publish and achieve recognition, she was asked personal questions, the kind of questions she had asked of her subjects. But she blanched. She was still carrying her mother’s secret, the secret of her real name, years after her mother had died. So when she was in the public eye, she insisted that the personal was off limits, that she would only comment on her work, despite the fact that one of the arguments animating her work is that thought and feeling are inseparable, the work and the person behind the work entwined. She remembers that moment well: shutting down when asked to reveal who she really was.

“That really began my journey and the arc of my beginning that conversation with myself,” she said.

But Turkle has long had an interest in memoirs, and she teaches a class on the subject at M.I.T. She was struck that scientists, engineers and designers often presented their work in purely intellectual terms, when, in conversation, “they’re impassioned by their lives, impassioned by their childhood, impassioned by a stone they found on the beach that got them thinking,” she said. “Everything about my research when I started interviewing scientists showed that their life’s work was lit up by the objects, the people, the relationships, that brought them to their work.”

Part of her motivation for teaching the course, she added, was to prompt her students into seeing their work and lives as connected. And she set out specifically to unite the two strands when she sat down to write her own memoir.

In her book, Turkle describes being denied tenure at M.I.T., a decision she fought and successfully reversed. She can laugh about it now (“What does a good woman have to do to get a job around here?”), but she felt marked by the experience.

Her colleague of nearly 50 years, Kenneth Manning, remembers the episode well. Turkle was “brilliant and creative” he said, but “she was bringing a whole new approach to looking at the computer culture, and she was coming from a psychoanalytic background. People didn’t quite understand that.” When he threw her a party to celebrate her tenure, some colleagues didn’t attend, he said.

Turkle now functions as a kind of “in-house critic,” as she imagines her colleagues might see her, writing about technology and its discontents from within an institution where technology is part of the name. “As her work has become more critical of the digital, there are certainly many elements at M.I.T. who have been dissatisfied with that, of course,” said David Thorburn, a literature professor at M.I.T.

The title of her new book reflects one of Turkle’s preoccupations. As we disappear into our lives onscreen, spending less time in reflective solitude and less time in real-life conversation with others, empathy, as Turkle sees it, is one of the casualties. The word, which she defines as “the ability not only to put yourself in someone else’s place, but to put yourself in someone else’s problem,” is not only a concern for Turkle but a kind of specialty: She has even been called in as a one-woman emergency empathy squad by a school where teachers had noticed that, with the proliferation of screens, their students seemed less and less able to put themselves in another person’s point of view.

One of Turkle’s hopes for this particular moment is that the pandemic has afforded us a view of one another’s problems and vulnerabilities in a way we might not have had as much access to before. In the first months of lockdown, Turkle moved her M.I.T. classes onto Zoom. “You could see where everyone lived,” she said. “It opened up a conversation about the disparities in what our situations were. Something that a ‘college experience’ hides.”

In many ways, Turkle believes that the pandemic is a “liminal” time, in the phrasing of the writer and anthropologist Victor Turner, a time in which we are “betwixt and between,” a catastrophe with a built-in opportunity to reinvent. “In these liminal periods are these possibilities for change,” she said. “I think we are living through a time, both in our social lives but also in how we deal with our technology, where we are willing to think of very different ways of behaving.”

Turkle isn’t opposed to technology. She “proudly” watches a lot of TV and loves writing on her extra-small MacBook, the kind they don’t make anymore. But she resists the lure of internet-enabled rabbit holes. “I am so aware of how I am being manipulated by the screen, and I am so uninterested in talking to Alexa and Siri,” she said.

She has spent most of the past year at her house in Provincetown, Mass., and so it is inevitable that Henry David Thoreau comes up. The naturalist and philosopher once famously walked some 25 miles of Cape Cod’s outer beach to reach Provincetown, at the tip of the peninsula.

“You know, Thoreau, his big thing wasn’t about being alone,” Turkle said. “His big thing was: I want to live deliberately. I think we have an opportunity with technology to live deliberately.”


Whole Foods will soon let customers pay for groceries with palm scan


Whole Foods will soon let customers pay for groceries using its parent company’s palm-scanning technology.

Amazon said Wednesday its palm-scanning system — currently used in about a dozen of its brick-and-mortar stores — will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, the first of many planned rollouts at other locations.

The system uses Amazon One technology, which employs high-tech imaging and algorithms to create and detect a “unique palm signature” based on the ridges, lines and veins in each person’s hand.

Unlike Apple’s Touch ID fingerprint technology, its high-tech sensors don’t require users to touch the scanning surface.

Instead, the palm-reading tech uses computer vision and depth geometry to process and identify the shape and size of each hand it scans before charging a credit card on file.
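Amazon hasn’t published how its matching actually works, but the general idea behind biometric template matching can be sketched in a few lines. In this hypothetical illustration (all names, vectors and the threshold are invented, not Amazon’s), a scan is reduced to a numeric feature vector and compared against enrolled templates with a similarity score:

```python
import math

def cosine_similarity(a, b):
    # 1.0 means the vectors point the same way; lower means less alike.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_palm(scan, enrolled, threshold=0.95):
    # Return the enrolled user whose stored template best matches the
    # scanned feature vector, or None if nothing clears the threshold.
    best_user, best_score = None, threshold
    for user, template in enrolled.items():
        score = cosine_similarity(scan, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user

# Toy enrolled templates: a real system would use far larger vectors
# derived from palm ridges, lines and veins, stored encrypted.
enrolled = {"alice": [0.9, 0.1, 0.4], "bob": [0.2, 0.8, 0.5]}
print(match_palm([0.88, 0.12, 0.41], enrolled))  # prints: alice
print(match_palm([0.0, 0.0, 1.0], enrolled))     # prints: None
```

The threshold is the key design choice in any such system: set it too low and strangers’ palms collide, too high and legitimate customers get rejected at the register.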

Amazon One will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, with many rollouts at other locations planned for the future.
Shannon Stapleton/Reuters

The company said that the palm-scanning tech will be offered as just one of many payment options at participating Whole Foods stores and that it won’t affect store employees’ job responsibilities.

“At Whole Foods Market, we’re always looking for new and innovative ways to improve the shopping experience for our customers,” said Arun Rajan, senior vice president of technology and chief technology officer at Whole Foods Market.

Palm images used by Amazon One are encrypted and stored in a “highly secure” cloud, and customers can request to have their palm data deleted.

The company claims palm-scanning tech is more private than other biometric alternatives, such as facial recognition.

Amazon One builds on the “Just Walk Out” technology that Amazon uses in its Go stores, which detects the items shoppers pick up and charges them once they leave — without the need for a checkout line.

Amazon is also planning to expand the cashier-less technology to Whole Foods, as reported by The Post.

Meanwhile, the tech could be good for Amazon’s bottom line. The online behemoth aims to sell its palm-scanning tech to other businesses, including retailers, stadiums and office buildings.

The scanner uses high-tech imaging and algorithms to create and detect a unique palm signature, which is then encrypted and stored in a secure cloud.
Amazon

Last September, it said it was in “active discussions with several potential customers.” But it is unclear if it has progressed on any of those fronts.


Apple’s new iPad Pros and TV remote don’t have U1 locators to help find them in your couch


Apple has been quietly sticking special locator beacon chips into some of its new iPhones that’ll let you unlock your car and find lost items through walls — the latter thanks to the $29 AirTags announced today — but sadly, you won’t find that chip in the new M1-based iPad Pros or the long-awaited new Siri remote for the Apple TV.

Apple confirmed to us that the U1 locator chip, which uses pulses of ultra-wideband (UWB) radio to broadcast its precise location, won’t appear in the Siri remote. We’re waiting on final bulletproof confirmation about the iPad Pros, but the chip doesn’t appear on their product page, spec sheet, or press release. Last year’s iPad Pros didn’t include a U1 chip, either.

Is Apple expecting us to stick AirTags to our iPads and TV remotes to escape the jaws of the ever-ravenous couch? Unlikely, but the company has been pretty choosy about which devices get the chip so far. You can find it in the iPhone 11 and newer (but not the iPhone SE) and in the Apple Watch Series 6 (but not the Apple Watch SE), but we’re pretty sure it hasn’t made its way into any iPads or MacBooks announced since the chip’s introduction in September 2019.

Theoretically, Apple could build an ecosystem where any Apple device can easily find any other Apple device (not to mention UWB devices from Samsung, which is also deeply invested in the tech and has its own AirTag-like device as well). But for now, you’ll primarily just be using your phone to find AirTags, not other gadgets, except perhaps your future car.


Your iPhone has a completely hidden app. Here’s how to find and use it


Apple’s iPhone is full of hidden features and tricks we’re constantly discovering. For instance, did you know the Notes app has a hidden document scanner? Yeah, pretty cool. The latest hidden feature that’s been popping up on Twitter and blogs is another type of scanner, dedicated to QR codes, and it’s better than the one built into the camera app.

To be sure, you can already scan QR codes using the shortcut in Control Center, or by simply opening the Camera app and pointing it at a code. Both of those methods work fine. But the dedicated Code Scanner app goes a step further by presenting more of the information you want to see about a scanned code.

For instance, the Camera app uses a small notification at the top of the screen to open a link or show you information, whereas the dedicated Code Scanner app makes it very clear what’s inside the QR code you just scanned. But here’s the rub: The Code Scanner app isn’t found on your home screen, nor does it appear in iOS 14’s new App Library.

As it turns out, the only way to find the Code Scanner app is to use the iPhone’s Spotlight search feature. Go to your iPhone’s home screen and swipe down in the center of the screen. A search bar will appear at the top of your screen, along with app and shortcut suggestions beneath it. Type either “code” or “scanner.” As you type, you’ll see the Code Scanner app icon show up as an app suggestion. Tap it to open.

The flashlight icon at the bottom of the screen acts as a flash to illuminate a code if your phone is struggling to read it.

If you don’t have the QR scanner shortcut added to Control Center yet, here’s a post showing you how to customize Control Center to your liking. For more hidden features, check out our list for iOS 14. We also cover more general, but useful features in iOS 14.
