
How The Death of Taylor Force in Israel Echoes Through the Fight Over Online Speech


WASHINGTON — Stuart Force says he found solace on Facebook after his son was stabbed to death in Israel by a member of the militant group Hamas in 2016. He turned to the site to read hundreds of messages offering condolences on his son’s page.

But only a few months later, Mr. Force decided that Facebook was partly to blame for the death, because the algorithms that power the social network helped spread Hamas’s content. He joined relatives of other terror victims in suing the company, arguing that its algorithms aided the crimes by regularly amplifying posts that encouraged terrorist attacks.

The legal case ended unsuccessfully last year when the Supreme Court declined to take it up. But arguments about the algorithms’ power have reverberated in Washington, where some members of Congress are citing the case in an intense debate about the law that shields tech companies from liability for content posted by users.

At a House hearing on Thursday about the spread of misinformation with the chief executives of Facebook, Twitter and Google, some lawmakers are expected to focus on how the companies’ algorithms are written to generate revenue by surfacing posts that users are inclined to click on and respond to. And some will argue that the law that protects the social networks from liability, Section 230 of the Communications Decency Act, should be changed to hold the companies responsible when their software turns the services from platforms into accomplices for crimes committed offline.

“The last few years have proven that the more outrageous and extremist content social media platforms promote, the more engagement and advertising dollars they rake in,” said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee, which will question the chief executives.

“By now it’s painfully clear that neither the market nor public pressure will stop social media companies from elevating disinformation and extremism, so we have no choice but to legislate, and now it’s a question of how best to do it,” Mr. Pallone, a New Jersey Democrat, added.

Former President Donald J. Trump called for a repeal of Section 230, and President Biden made a similar comment while campaigning for the White House. But a repeal looks increasingly doubtful, with lawmakers focusing on smaller possible changes to the law.

Altering the legal shield to account for the power of the algorithms could reshape the web, because algorithmic sorting, recommendation and distribution are common across social media. The systems decide what links are displayed first in Facebook’s News Feed, which accounts are recommended to users on Instagram and what video is played next on YouTube.
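The ranking systems described above can be loosely illustrated in code. This is a minimal, hypothetical sketch of engagement-based feed ranking; the field names, weights and scoring formula are invented for illustration, and real platform algorithms are proprietary and far more complex.

```python
# Hypothetical sketch: rank posts by predicted engagement, as the
# article describes. All names and weights here are illustrative.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float   # model's estimate of click probability
    predicted_replies: float  # model's estimate of reply probability

def rank_feed(posts, w_clicks=1.0, w_replies=2.0):
    """Order posts by a weighted engagement score, highest first."""
    def score(p):
        return w_clicks * p.predicted_clicks + w_replies * p.predicted_replies
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post("a", predicted_clicks=0.10, predicted_replies=0.02),
    Post("b", predicted_clicks=0.05, predicted_replies=0.20),
])
print([p.post_id for p in feed])  # → ['b', 'a']
```

The sketch shows why critics focus on the scoring step: whatever content best predicts clicks and replies rises to the top, regardless of its message.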

The industry, free-speech activists and other supporters of the legal shield argue that social media’s algorithms are applied equally to posts regardless of the message. They say the algorithms work only because of the content provided by users and are therefore covered by Section 230, which protects sites that host people’s posts, photos and videos.

Courts have agreed. A federal district judge said even a “most generous reading” of the allegations made by Mr. Force “places them squarely within” the immunity granted to platforms under the law.

A spokesman for Facebook declined to comment on the case but pointed to comments from its chief executive, Mark Zuckerberg, supporting some changes to Section 230. Elena Hernandez, a spokeswoman for YouTube, which is owned by Google, said the service had made changes to its “search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations.”

Twitter noted that it had proposed giving users more choice over the algorithms that ranked their timelines.

“Algorithms are fundamental building blocks of internet services, including Twitter,” said Lauren Culbertson, Twitter’s head of U.S. public policy. “Regulation must reflect the reality of how different services operate and content is ranked and amplified, while maximizing competition and balancing safety and free expression.”

Photo credit: U.S. Military Academy, via Associated Press

Mr. Force’s case began in March 2016 when his son, Taylor Force, 28, was killed by Bashar Masalha while walking to dinner with graduate school classmates in Jaffa, an Israeli port city. Hamas, a Palestinian group, said Mr. Masalha, 22, was a member.

In the ensuing months, Stuart Force and his wife, Robbi, worked to settle their son’s estate and clean out his apartment. That summer, they got a call from an Israeli litigation group, which had a question: Would the Force family be willing to sue Facebook?

After Mr. Force spent some time on a Facebook page belonging to Hamas, the family agreed to sue. The lawsuit fit into a broader effort by the Forces to limit the resources and tools available to Palestinian groups. Mr. Force and his wife allied with lawmakers in Washington to pass legislation restricting aid to the Palestinian Authority, which governs part of the West Bank.

Their lawyers argued in an American court that Facebook gave Hamas “a highly developed and sophisticated algorithm that facilitates Hamas’s ability to reach and engage an audience it could not otherwise reach as effectively.” The lawsuit said Facebook’s algorithms had not only amplified posts but aided Hamas by recommending groups, friends and events to users.

The federal district judge, in New York, ruled against the claims, citing Section 230. The lawyers for the Force family appealed to a three-judge panel of the U.S. Court of Appeals for the Second Circuit, and two of the judges ruled entirely for Facebook. The other, Judge Robert Katzmann, wrote a 35-page dissent to part of the ruling, arguing that Facebook’s algorithmic recommendations shouldn’t be covered by the legal protections.

“Mounting evidence suggests that providers designed their algorithms to drive users toward content and people the users agreed with — and that they have done it too well, nudging susceptible souls ever further down dark paths,” he said.

Late last year, the Supreme Court rejected a call to hear a different case that would have tested the Section 230 shield. In a statement attached to the court’s decision, Justice Clarence Thomas called for the court to consider whether Section 230’s protections had been expanded too far, citing Mr. Force’s lawsuit and Judge Katzmann’s opinion.

Justice Thomas said the court didn’t need to decide in the moment whether to rein in the legal protections. “But in an appropriate case, it behooves us to do so,” he said.

Some lawmakers, lawyers and academics say recognition of the power of social media’s algorithms in determining what people see is long overdue. The platforms usually do not reveal exactly what factors the algorithms use to make decisions and how they are weighed against one another.

“Amplification and automated decision-making systems are creating opportunities for connection that are otherwise not possible,” said Olivier Sylvain, a professor of law at Fordham University, who has made the argument in the context of civil rights. “They’re materially contributing to the content.”

That argument has appeared in a series of lawsuits that contend Facebook should be responsible for discrimination in housing when its platform could target advertisements according to a user’s race. A draft bill produced by Representative Yvette D. Clarke, Democrat of New York, would strip Section 230 immunity from targeted ads that violated civil rights law.

A bill introduced last year by Representatives Tom Malinowski of New Jersey and Anna G. Eshoo of California, both Democrats, would strip Section 230 protections from social media platforms when their algorithms amplified content that violated some antiterrorism and civil rights laws. The news release announcing the bill, which was reintroduced on Wednesday, cited the Force family’s lawsuit against Facebook. Mr. Malinowski said he had been inspired in part by Judge Katzmann’s dissent.

Critics of the legislation say it may violate the First Amendment and, because there are so many algorithms on the web, could sweep up a wider range of services than lawmakers intend. They also say there’s a more fundamental problem: Regulating algorithmic amplification out of existence wouldn’t eliminate the impulses that drive it.

“There’s a thing you kind of can’t get away from,” said Daphne Keller, the director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center, “which is human demand for garbage content.”


Whole Foods will soon let customers pay for groceries with palm scan


Whole Foods will soon let customers pay for groceries using its parent company’s palm-scanning technology.

Amazon said Wednesday its palm-scanning system — currently used in about a dozen of its brick-and-mortar stores — will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, the first of many planned rollouts at other locations.

The system uses Amazon One technology, which employs high-tech imaging and algorithms to create and detect a “unique palm signature” based on the ridges, lines and veins in each person’s hand.

Its high-tech sensors don’t require users to touch the scanning surface, like Apple’s fingerprint technology does.

Instead, palm-reading tech uses computer vision and depth geometry to process and identify the shape and size of each hand they scan before charging a credit card on file.
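The matching step described above can be sketched as a standard biometric template comparison: a feature vector (the “palm signature”) is compared against an enrolled template, and the scan is accepted only if the two are close enough. Amazon One’s actual pipeline is proprietary; the feature vectors, similarity measure and threshold below are all invented for illustration.

```python
# Hypothetical sketch of biometric template matching. The enrolled
# "palm signature" and the threshold are illustrative, not Amazon's.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def matches(enrolled_signature, scanned_signature, threshold=0.95):
    """Accept the scan only if it is close enough to the enrolled template."""
    return cosine_similarity(enrolled_signature, scanned_signature) >= threshold

enrolled = [0.82, 0.10, 0.55, 0.31]   # stored palm signature (encrypted at rest)
scan_ok  = [0.80, 0.12, 0.54, 0.30]   # fresh scan from the same hand
scan_bad = [0.10, 0.90, 0.05, 0.70]   # a different hand

print(matches(enrolled, scan_ok))   # → True
print(matches(enrolled, scan_bad))  # → False
```

The threshold is the key design choice in systems like this: set it too low and strangers’ palms may be accepted; set it too high and legitimate customers get rejected at the register.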

Amazon One will debut at a Whole Foods in Seattle’s Capitol Hill neighborhood, with many rollouts at other locations planned for the future.
Shannon Stapleton/Reuters

The company said that the palm-scanning tech will be offered as just one of many payment options at participating Whole Foods Stores and that it won’t impact store employees’ job responsibilities.

“At Whole Foods Market, we’re always looking for new and innovative ways to improve the shopping experience for our customers,” said Arun Rajan, senior vice president of technology and chief technology officer at Whole Foods Market.

Palm images used by Amazon One are encrypted and stored in a “highly secure” cloud, and customers can request to have their palm data deleted.

The company claims palm-scanning tech is more private than other biometric alternatives, such as facial recognition.

Amazon One builds on the “Just Walk Out” technology that Amazon uses in its Go stores, which detects the items shoppers pick up and charges them once they leave — without the need for a checkout line.

Amazon is also planning to expand the cashier-less technology to Whole Foods, as reported by The Post.

Meanwhile, the tech could be good for its bottom line. The online behemoth aims to sell its palm-scanning tech to other companies like retailers, stadiums and office buildings.

The scanner uses high-tech imaging and algorithms to create and detect a unique palm signature, which is then encrypted and stored in a secured cloud.
Amazon

Last September, it said it was in “active discussions with several potential customers.” But it is unclear if it has progressed on any of those fronts.


Apple’s new iPad Pros and TV remote don’t have U1 locators to help find them in your couch


Apple has been quietly sticking special locator beacon chips into some of its new iPhones that’ll let you unlock your car and find lost items through walls — the latter thanks to the $29 AirTags announced today — but sadly, you won’t find that chip in the new M1-based iPad Pros or the long-awaited new Siri remote for the Apple TV.

Apple confirmed to us that the U1 locator chip, which uses pulses of ultra-wideband (UWB) radio to broadcast its precise location, won’t appear in the Siri remote. We’re waiting on final bulletproof confirmation about the iPad Pros, but it also doesn’t appear on their product page, spec sheet, or press release. Last year’s iPad Pros didn’t include a U1 chip, either.

Is Apple expecting us to stick AirTags to our iPads and TV remotes to escape the jaws of the ever-ravenous couch? Unlikely, but the company has been pretty choosy about which devices get the chip so far. You can find it in the iPhone 11 and newer (but not the iPhone SE) and the Apple Watch Series 6 (but not the Apple Watch SE), but we’re pretty sure it hasn’t made its way to any iPads or MacBooks that have been announced since the chip’s introduction in September 2019.

Theoretically, Apple could build an ecosystem where any Apple device can easily find any other Apple device (not to mention UWB devices from Samsung, which is also deeply invested in the tech and has its own AirTag-like device as well). But for now, you’ll primarily just be using your phone to find AirTags, not other gadgets, except perhaps your future car.


Your iPhone has a completely hidden app. Here’s how to find and use it


Apple’s iPhone is full of hidden features and tricks we’re constantly discovering. For instance, did you know the Notes app has a hidden document scanner? Yeah, pretty cool. The latest hidden feature that’s been popping up on Twitter and blogs is another type of scanner, dedicated to QR codes, and it’s better than the one built into the camera app.

To be sure, you can already scan QR codes using the shortcut in Control Center, or simply open the camera app and point it at a QR code. Both of those methods work great. However, the dedicated Code Scanner app goes a step further by presenting more of the information you want to see about a scanned code.

For instance, the camera app uses a small notification at the top of the screen to open a link or show you information, whereas the dedicated Code Scanner app makes it very clear what’s inside the QR code you just scanned. But here’s the rub: the Code Scanner app isn’t found on your home screen, nor in iOS 14’s new App Library.

As such, the only way to find the Code Scanner app is to use the iPhone’s Spotlight search feature. Go to your iPhone’s home screen and swipe down in the middle of the screen. A search bar will appear at the top of your screen, along with app and shortcut suggestions underneath. Type either code or scanner. As you type, you’ll see the Code Scanner app icon appear as an app suggestion. Tap to open it.

The flashlight icon at the bottom of the screen acts as a flash to illuminate a code if your phone is struggling to read it.

If you don’t have the QR scanner shortcut added to Control Center yet, here’s a post showing you how to customize Control Center to your liking. For more hidden features, check out our list for iOS 14. We also cover more general, but useful features in iOS 14.
