
Senate pursues action against AI deepfakes in election campaigns

Politicians, like us all, often engage in hyperbole to make a point.

But don’t doubt the alliterative precision of Sen. Richard Blumenthal (D-Conn.) when he warns about “a deluge of deception, disinformation and deepfakes … about to descend on the American public.”

“There is a clear and present danger to our democracy,” he added for emphasis during the Senate Judiciary subcommittee hearing on election deepfakes that he chaired last week.

One thing we don’t need after the Jan. 6, 2021, Capitol insurrection is another danger to democracy. But unlike the televised violence that day, Blumenthal’s hearing showed how artificial intelligence can be used to subvert elections much more covertly than MAGA rioters supporting President Donald Trump attempted to do three years ago.

It’s a bipartisan threat that has generated bipartisan determination.

Two Democrats, Sens. Amy Klobuchar (Minn.) and Chris Coons (Del.), and two Republicans, Sens. Josh Hawley (Mo.) and Susan Collins (Maine), are pushing legislation that would ban deceptive AI materials in political ads. The bill, introduced in September, also would allow federal office seekers to ask U.S. courts to order removal of bogus information and to award compensation to the candidates.

But legislation doesn’t move swiftly, and the urgency is clear, as the hearing emphasized.

“Are we going to have to have an electoral disaster before Congress realizes, ‘Gee, we really should do something to give the public some sense of safety, some sense of certainty that what they’re seeing and hearing is actually real or is it in fact manufactured,’” asked Hawley, the top Republican on the privacy, technology and the law subcommittee, which held last week’s session.

Deepfakes with Trump and President Biden have already been used to fool the public.

Hearing witness David Scanlan, secretary of state in New Hampshire, recalled getting “ready to conduct a really good” presidential primary there the weekend before the vote in January. Then things changed. He started hearing about “a robocall using AI with President Biden’s voice on it, asking individuals not to vote in the election.” The call appeared to come from a phone number associated with a former Democratic Party official.

“It’s important that you save your vote for the November election,” the voice said. “Your vote makes a difference in November, not this Tuesday.”

The message was fake, as was the association with the party official.

“That’s what suppression of voter turnout looks like,” Blumenthal said after playing the audio during the hearing. The Associated Press said it “may be the first known attempt to use artificial intelligence to interfere with a U.S. election.”

Last month, the BBC reported on bogus pictures of Trump surrounded by African Americans, apparently circulated to give a false impression of his level of Black support. Last year, deepfake images related to Trump’s court appearances showed him scuffling with police and wearing prison garb.

What’s also disturbing is how little effort it takes to fool people: today’s technology makes convincing fakes easy to produce. With free online programs, Blumenthal said, “voice cloning, deepfake images and videos are disturbingly easy for anyone to create.”

The one with Biden was done “by a street magician whose previous claim to fame was that he has world records in spoon bending and escaping straitjackets,” Blumenthal added. “And if a street magician can cause this much trouble, imagine what Vladimir Putin or China can do. In fact, they’re doing it.”

Five years ago, The Washington Post reported on a slick Russian campaign that used social media to discourage Black voters, according to documents released by the Senate Intelligence Committee. One poster showed a Black man’s face next to the words “I Won’t Vote.” That was primitive compared with today’s efforts.

While deepfakes involving Biden or Trump will get publicity if discovered, Blumenthal said that “local elections present an even bigger risk” because of the disturbing decline in local journalism, an issue the senator explored in a January hearing.

“When a local newspaper is closed or understaffed, there may be no one doing fact-checking, no one to issue those Pinocchio images and no one to correct the record,” he said last week. “That’s a recipe for toxic and destructive politics.”

Furthermore, a March Government Accountability Office report warned that “trust in real media may be undermined by false claims that real media is a deepfake.” In other words, AI makes it easier for fake news to trump real news.

The problem is growing fast: the volume of deepfake content online increased by 900 percent between 2019 and 2020, according to the World Economic Forum.

But there are remedies for the toxins AI can generate.

At the hearing, Zohaib Ahmed, CEO and co-founder of Resemble AI in Santa Clara, Calif., urged the “creation of a public database where all generated election content is registered, allowing voters to easily access information about the origin and nature of the content that they encounter.” He and others also suggested using digital watermarking technology to verify content authenticity.

Whatever remedies are used, the time is now. Some action is already underway. In February, following a request from Klobuchar and Collins, the bipartisan U.S. Election Assistance Commission voted unanimously to allow federal funds to counter disinformation “amplified by AI technologies.”

But “by the time the deepfake widely spreads, any report calling it a fake is also too late,” said Ben Colman, CEO and co-founder of Reality Defender, a tool that can detect deepfakes. “This is not fearmongering, nor is it AI alarmism, doomerism, or conspiracy-minded hyperbole. It is simply the logical progression of the weaponization of deepfakes.”

He applauded the legislation, the Protect Elections from Deceptive AI Act, but urged more action “by imposing real penalties on bad actors” who “morph reality and on the platforms that fail to stop their spread.”

This is personal for Klobuchar, who spoke about a doctored Russian photo that falsely suggested she funds Nazis in Ukraine. “This photo had a red circle around me in the background,” she said, “and then they put defund the police signs in the hands of the people at the rally that were never there.”

Klobuchar called for quick action on her bill and a strong approval vote in committee so “we can immediately get this thing heard …

“We really can’t wait.”

This post appeared first on The Washington Post
