Even Before Pandemic - UNLV NewsCenter

Posted: 05 Oct 2020 12:00 AM PDT

Even before the coronavirus pandemic propelled UNLV into remote learning in the spring, online courses at UNLV were prevalent.

"There's been a steady decrease in the number of students that have never taken an online course," said Elizabeth Barrie, the director of the Office of Online Education. She recently presented during The State of Online Education webinar, which highlighted some of the initiatives and cross-campus partnerships that contribute to student achievement and shared how faculty prepared for online learning through the summer.

She noted that 95% of students who graduated in spring 2020 with an undergraduate degree had taken at least one online course. And, compared to past years, there has been an increase in the number of students who have taken more than 30 credits, or two semesters, online.

Privacy Expert, 2019 'Genius Grant' Winner Joins UVA Law Faculty - UVA Today

Posted: 11 Feb 2021 12:16 PM PST

Danielle Citron, a pioneering law professor in the area of digital privacy who helped now-Vice President Kamala Harris in her effort to combat nonconsensual pornography, is joining the University of Virginia School of Law's faculty this month.

Citron is the inaugural Jefferson Scholars Foundation Schenck Distinguished Professor in Law. She comes to UVA from the Boston University School of Law.

"The work that Professor Citron is doing on digital privacy has been truly groundbreaking. It is no exaggeration to say that she has built a field," UVA School of Law Dean Risa Goluboff said. "We are so fortunate to have her join the Law School and UVA. She is the model of the engaged scholar. She is dedicated not only to teaching and the generation of new knowledge, but also to shaping the public discussion and making an impact on the law and larger society."

Citron's professorship is funded by the Jefferson Scholars Foundation, whose stated mission is "to benefit the University of Virginia by identifying, attracting and nurturing individuals of extraordinary intellectual range and depth who possess the highest concomitant qualities of leadership, scholarship and citizenship."

Citron was a recipient of a 2019 MacArthur Foundation Fellowship, informally known as a "genius grant," for her work on cyberstalking and intimate privacy, including her efforts to change how the public thinks about online harassment, from a triviality to a civil rights problem. Because women and minority groups are often targeted online, she has made the argument that civil rights law – which can address harms not covered by other laws – could be applied in many cases and would serve "a crucial expressive role."

"Law would teach us that cyberstalking deprives women and marginalized communities of crucial opportunities to work, speak and go to school," she said. "Regrettably, the United States is as bad as some of the worst countries."

Citron helped found the nonprofit Cyber Civil Rights Initiative in 2013, a name inspired by her foundational paper, "Cyber Civil Rights." She has focused her legislative efforts on a multitude of privacy issues in the online environment. Among them is what has been called "revenge porn," although she and her colleagues at the nonprofit, of which she is the vice president, prefer to reframe the problem as "nonconsensual pornography" in order to emphasize the harm done.

Harvard University Press published her book, "Hate Crimes in Cyberspace," in 2014.

Harvard law professor Jonathan Zittrain hailed the book as a "call to action and a thought-provoking roadmap." University of Chicago law professor Martha Nussbaum said it is a "lucid summary" that shows "we can do quite a lot for victims of cyberabuse without chilling expression."

Cosmopolitan magazine named the book one of the "20 Best Moments for Women" that year.

Through staffers, the book also got the attention of then-California Attorney General Kamala Harris. One of the members of Harris' executive team reached out to Citron. "Our team has read your book. Will you come talk to us?" she asked.

Citron ended up advising Harris and her team for two years, and during a portion of that time, served on the Attorney General's Cyber Exploitation Task Force.

The task force "had a series of meetings, and the first was in-person with 50 companies, like everyone in Silicon Valley," Citron said. "A lot of them didn't necessarily want to be there, but they came – Google, YouTube, Facebook, Twitter, Pinterest, you name it."

In direct and indirect ways, the tech giants were enabling nonconsensual pornography, she said. However, because of Section 230 of the federal Communications Decency Act of 1996, which was the first major attempt by Congress to regulate internet pornography, Harris' office couldn't pursue action against the tech companies for serving as intermediaries. Courts have interpreted the section as meaning operators of internet services cannot be held liable for what third parties post.

Citron gave a presentation on how the companies' reticence to act affected the lives of victims. Google, for example, had a hands-off approach related to its search engine and what content would appear in searches of individuals' names, such as nonconsensual porn. For those instances in which extortion charges couldn't be brought, the Harris team and their allies ultimately sought a way that the content could be taken down or de-listed.

"At the meeting, it was clear that YouTube and Google were very resistant," Citron said.

But celebrities added pressure to the cause: "It was just after the leaking of all of those female celebrities' nude photos from the iCloud hack – Jennifer Lawrence, Gabrielle Union – those prominent women were up in arms about nonconsensual pornography."

Lawrence asserted in an interview with Vanity Fair that nonconsensual pornography is "a sex crime."

"It was incredible," Citron said. "We had that at our backs, and then [Harris] saying, 'Please work on this.' Section 230 prevented her from doing much about it, other than to try to go after perpetrators that had hidden themselves pretty well, and there were thousands and thousands of them."  

By the summer of 2015, Google announced that it would de-index nonconsensual porn in searches of people's names when requested by the affected individual, as did Bing.

"I almost fell over in my chair," Citron said of receiving the news from a USA Today tech reporter. "That's what victims wanted. They don't want this content in searches of their name, because it's impossible to keep a job, have a job and have a life, right?

"So, I credit the AG, now our vice president, with changing the landscape entirely," she said.

In 2016, Citron published an article, "The Privacy Policymaking of State Attorneys General," in the Notre Dame Law Review. While the work was the product of an already-planned research sabbatical, she said, the coincidence of having been invited to assist Harris informed the piece.

In addition to her past service in California, Citron has been a member of Facebook's Non-Consensual Intimate Imagery Task Force since 2011, and an adviser and a member of Twitter's Trust and Safety Task Force since 2009.

She is currently working with U.S. Senate and House staff members on proposed amendments to Section 230, as well as a bill to criminalize digital forgeries – notably those "deepfake" videos that, for example, might make it appear through digital manipulation that someone participated in a pornographic video who did not.

Last year, Citron testified before Congress on both issues. 

In the political realm, she said, the U.S. "dodged a bullet" in the recent presidential election; deepfakes featuring misleading information about political candidates didn't materialize in a significant way. The future may be different, however, she warned. Her TED talk, "How Deepfakes Undermine Truth and Democracy," has been viewed more than 1.9 million times.

Citron has also published dozens of articles related to online privacy and the automation of government decision-making, and has been featured in hundreds more stories for print and broadcast. Her professional awards include recognition from the International Association of Privacy Professionals, as well as from the privacy think tank Future of Privacy Forum.

Her new book project has the working title of "The End of Privacy: How Intimacy Became Data and How to Stop It." At the heart of the book, she said, will be a focus on her theory of intimate privacy – "our ability to manage who has access to and information about our bodies, close relationships, sexual health, gender and sexuality, on and offline."

She wants intimate privacy to be understood as a basic human right.

This year, she received a $75,000 grant from the Knight Foundation to co-lead a study looking for positive impacts of intimate privacy laws on victims.

Citron is an affiliate scholar at the Stanford University Center on Internet and Society, the Yale University Information Society Project and New York University's Policing Project. She is a member of Axon's advisory board on artificial intelligence ethics. As a member of the American Law Institute, she serves as an adviser to the publication "Restatement Third, Information Privacy Principles Project and Restatement (Third) Torts: Defamation and Privacy."

Before turning to academia, she was an associate at Willkie Farr & Gallagher, where she helped the firm draft an updated sexual harassment policy. A graduate of Fordham University School of Law, she clerked at the U.S. District Court for the Southern District of New York for two years after graduation.

UVA Law professor Deborah Hellman, who has known Citron since both were law professors at the University of Maryland, said Citron will fit well in UVA Law's highly collaborative environment.

"She's such a great colleague – generous with her time, always willing to read drafts and super helpful as a critic and editor, interesting to talk with, engaged and fun," Hellman said.

Citron said she looks forward to her office being near Hellman's, as well as the conversations she hopes to have with her and other luminaries on the faculty.  

She also can't wait to meet her new students and teaching assistants, she said. She won the top teaching award her first year at the University of Maryland, in 2005, and proudly includes her student evaluation numbers on her curriculum vitae. As in years past, she expects to teach civil procedure and a seminar on digital privacy. Her most recent seminar featured guest appearances, via Zoom, from all of the authors whose books students read for the course.

"I love my students," Citron said. "Teaching is a part of who I am."

Maryland Nears Country’s First Tax on Big Tech’s Ad Revenue - The New York Times

Posted: 12 Feb 2021 02:00 AM PST

State politicians, struggling with yawning budget gaps from the pandemic, have made no secret about their interest in getting a bigger piece of the tech industry's riches.

Now, Maryland's lawmakers are on the verge of taking a new slice, with the nation's first tax on the revenue from digital advertisements sold by companies like Facebook, Google and Amazon.

The state's House of Delegates voted on Thursday to override the governor's veto of the proposed law, and the State Senate is expected to follow suit. The tax would generate as much as an estimated $250 million in the first year after enactment, with the money going to schools.

The approval would signal the arrival in the United States of a policy pioneered by European countries, and it is likely to set off a fierce legal fight over how far communities can go to tax the tech companies.

Other states are pursuing similar efforts. Lawmakers in Connecticut and Indiana, for example, have already introduced bills to tax the social media giants. Several other states, like West Virginia and New York, fell short of passing new taxes on the tech giants last year, but their proponents may renew their push after Maryland's success.

The moves are part of an escalating debate about the economic power of the tech giants as the companies have grown, become gatekeepers for communication and culture and started to collect reams of data from their users. In the United States, law enforcement agencies brought multiple antitrust cases against Google and Facebook last year. Members of Congress have proposed laws to check their market power, encourage them to moderate speech more carefully and protect their users' privacy.

Maryland's tax also reflects the collision of two economic trends during the pandemic: The largest tech companies have had milestone financial performances as social distancing moved work, play and commerce further online. But cities and states saw their tax revenues plummet as the need for their social services grew.

"They're really getting squeezed," said Ruth Mason, a professor at the University of Virginia's law school. "And this is a huge way to target a tax to the winners of the pandemic."

Lobbying groups for Silicon Valley companies like Google and Facebook have joined other opponents of the law — including Maryland Republicans, telecom companies and local media outlets — in arguing that the cost of the tax will be passed along to small businesses that buy ads and their customers. Doug Mayer, a former aide to Gov. Larry Hogan who now leads a coalition backed by industry opponents of the tax, said at a news conference last week that the law's supporters were "using this bill to take a swing at out-of-state, faceless big corporations."

"But they're swinging and missing and hitting their own constituents in the mouth," he said.

The Maryland tax, which applies to revenue from digital ads that are displayed inside the state, would be based on the ad sales a company generates. A company that makes at least $100 million a year in global revenue but no more than $1 billion a year would face a 2.5 percent tax on its ads. Companies that make more than $15 billion a year would pay a 10 percent tax. Facebook's and Google's global revenues far exceed $15 billion.
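The bracket structure quoted above can be expressed directly. Below is a minimal, hypothetical Python sketch of a rate lookup using only the figures cited in the article; the statute's intermediate brackets (for companies with between $1 billion and $15 billion in global revenue) are not quoted here, so the sketch leaves them undefined, and the exact threshold handling is an assumption.

```python
from typing import Optional

def maryland_ad_tax_rate(global_annual_revenue: float) -> Optional[float]:
    """Return the rate applied to Maryland digital-ad revenue, using only the
    brackets quoted in the article. Threshold handling is an assumption."""
    if global_annual_revenue < 100_000_000:
        return 0.0        # below the $100 million floor: not subject to the tax
    if global_annual_revenue <= 1_000_000_000:
        return 0.025      # 2.5% for $100 million to $1 billion in global revenue
    if global_annual_revenue > 15_000_000_000:
        return 0.10       # 10% above $15 billion (e.g., Facebook, Google)
    return None           # intermediate brackets are not specified in the article

# Example: a company with $20 billion in global revenue and $50 million of
# Maryland ad sales would owe roughly $50 million * 0.10 = $5 million.
print(maryland_ad_tax_rate(20_000_000_000))  # 0.1
```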

Bill Ferguson, a Baltimore Democrat who is president of the State Senate, was a main driver behind the bill. He said he was inspired by an Op-Ed essay from the economist Paul Romer proposing taxing targeted ads to encourage the companies to change their business models.

"This idea that one outsider can exploit and use the personal data of another area and pay nothing for its use, that doesn't work in the long run," Mr. Ferguson said.

Maryland's Democratic-controlled legislature passed the tax with veto-proof majorities last March. But Mr. Hogan, a moderate Republican, vetoed the measure in May.

"With our state in the midst of a global pandemic and economic crash, and just beginning on our road to recovery, it would be unconscionable to raise taxes and fees now," Mr. Hogan said in a letter explaining his reasoning.

Late last year, industry groups helped to form a lobbying organization to try to stop the legislature from overriding Mr. Hogan's veto.

For months, the organization, Marylanders for Tax Fairness, backed by some of Silicon Valley's top lobbying groups, has warned Maryland lawmakers in spots on cable news and local radio that a proposed tax on digital advertisements is a "bad idea" at a "bad time."

The coalition has highlighted the stories of small businesses that it says will ultimately pay the cost of the new tax when they buy online ads.

"A new $250 million tax during a pandemic," said the deep-voiced narrator of one ad over a video of a bar in Annapolis. "Tell your legislator: Stop the digital ad tax."

While some states apply a sales tax to some digital goods and services when they are bought by customers, the Maryland tax would be the first to be applied solely to the revenue a company got from digital advertising in the United States, experts said. The state's lawmakers are expected to approve a second bill in the coming days making clear that the tax does not apply to media companies and that the cost cannot be directly passed along to businesses that buy ads, although critics say the tax would still lead to higher prices for ads.

European policymakers have turned to digital taxes in recent years as part of a larger regulatory push against the American tech giants. France has imposed a 3 percent tax on some digital revenue. Austria taxes income from digital advertising at 5 percent. The European efforts were condemned by the Trump administration, which threatened to impose tariffs on French goods over the issue.

"I don't think the issue's any different in Maryland than it is in California, India, France or Spain," said State Senator James Rosapepe, a Democrat who is the vice chair of the taxation committee. "Given that they're so profitable, they ought to be paying taxes."

Maryland's tax is likely to face court challenges.

Opponents may argue that because the largest tech companies are not based in Maryland, the law would tax activity that originated outside the state, violating the Constitution. They may also argue that the law runs afoul of a federal law that says taxes on digital goods or services must also apply to equivalent physical products.

"It's tax discrimination," said Dave Grimaldi, the executive vice president for public policy at IAB, an online advertising trade group. "There will be all manner of challenges as soon as it is enacted."

But the law's backers said they believed they were on solid ground to start taxing the giants.

"We anticipate that, even in overriding, it is likely that the industry will file a lawsuit," Mr. Ferguson said. He said lawmakers had asked the state attorney general's office if it felt it could defend the law.

"And they did," he said. "They signed off."

Pressure builds on Facebook Oversight Board - POLITICO

Posted: 11 Feb 2021 07:00 AM PST

With help from Zach Montellaro, Andy Blatchford, John Hendel, Steven Overly and Leah Nylen

PROGRAMMING NOTE: Morning Technology will not publish on Monday, Feb. 15. We'll be back on our normal schedule on Tuesday, Feb. 16. Please continue to follow Pro Technology.

Editor's Note: Morning Tech is a free version of POLITICO Pro Technology's morning newsletter, which is delivered to our subscribers each morning at 6 a.m. The POLITICO Pro platform combines the news you need with tools you can use to take action on the day's biggest stories. Act on the news with POLITICO Pro.

— Scoop: Facebook's former security chief is among a group of prominent lawyers and academics urging the independent Facebook Oversight Board not to let former president Donald Trump back on the platform.

— Happening this morning: House Energy and Commerce is voting on a measure that would tuck into the next Covid relief package billions in funding to subsidize resources for students having to take classes online.

— TikTok timeline: There isn't one. The Biden team put the former administration's actions against the Chinese-owned video app on hold, upping the intrigue around the new White House's approach to Beijing.

GREETINGS, TECHLINGS: IT'S THURSDAY. WELCOME TO MORNING TECH! I'm your host, Alexandra Levine.

Got a news tip? Write me at [email protected], and follow @Ali_Lev on Twitter and @alexandra.levine on Instagram. An event for our calendar? Send details to [email protected]. Anything else? Team info below. And don't forget: Add @MorningTech and @PoliticoPro on Twitter.

COMMENTS ON TRUMP SUSPENSION FLOOD OVERSIGHT BOARD — Facebook's former chief security officer is among a group writing today to the Facebook Oversight Board urging the independent body against allowing Trump back on the platform. (Friday is the deadline for public input on the board's case on whether the social network should reverse its decision to indefinitely suspend the former president.) The letter, shared first with POLITICO via Morning Score, is being submitted to the oversight board in response to its request for public comment.

— Facebook veteran Alex Stamos, who is now director of the Stanford Internet Observatory, signed onto the letter led by election expert Rick Hasen that said Facebook had made "the correct decision." While they acknowledged the dangers of banning a political leader, "Trump's actions justified the step of indefinitely deplatforming him," they wrote, concluding: "There no doubt will be close calls under a policy that allows the deplatforming of political leaders in extreme circumstances. This was not one of them."

— Among the other signees were Janai Nelson of the NAACP-LDF and Norm Ornstein, an emeritus scholar at the American Enterprise Institute. And they made clear they also hold Facebook responsible: "Without social media spreading Trump's statements," they wrote, "it seems extremely unlikely these events would have occurred."

— Keep an eye out: Oversight Board co-chair Helle Thorning-Schmidt, the former Danish prime minister, and its head of communications, Dex Hunter-Torricke, are speaking this morning at a Carnegie Endowment event on Facebook's Trump ban.

IN OTHER FACEBOOK NEWS — The social giant says it will temporarily reduce the amount of political content in news feeds for some users starting this week. The platform's initial test is being done for a small percentage of people in Canada, Brazil and Indonesia, followed by the U.S. in the coming weeks. These are the countries where users most commonly told the company there was too much political content in their feeds, Facebook Canada's Erin Taylor told my colleague, Andy Blatchford, in a statement.

— There's a robot for that: For these first test cases, Facebook will try leaning on machine learning to identify the political content. They'll use "a machine learning model that is trained to look for signals of political content and predict whether a post is related to politics," Taylor said. "We'll be refining this model during the test period to better identify political content, and may or may not end up using this method longer-term."

— Going, but not gone: Facebook may temporarily cut back on the distribution of political content in news feeds, but it won't remove political content from the platform altogether, Taylor said: "Our goal is to preserve the ability for people to find and interact with political content on Facebook, while respecting each person's appetite for it at the top of their News Feed."

TODAY: HOUSE E&C VOTES ON COVID-ERA INTERNET BOOST — House Energy and Commerce lawmakers vote today on legislative recommendations to have the FCC administer a $7.6 billion Emergency Connectivity Fund that would help pay for households' purchases of laptops, tablets and at-home connectivity devices (like Wi-Fi hotspots, modems and routers) during the pandemic. The provision, slating money intended to help schools and libraries offer at-home internet connectivity, is one of many in the pandemic relief legislation that falls into E&C's jurisdiction.

— Cheerleaders across the dome: Democratic Sens. Ed Markey (Mass.), Chris Van Hollen (Md.), Michael Bennet (Colo.) and Maggie Hassan (N.H.) joined Rep. Grace Meng (D-N.Y.) in applauding the provisions and urging the fund's retention in a final Covid aid bill.

— Building on E-Rate: The provision builds on a long-running push to expand the FCC's existing subsidy program called E-Rate, which helps connect schools and libraries. And incoming Senate Commerce Chair Maria Cantwell (D-Wash.) also wants the type of money the House lawmakers are eyeing, she told reporters Tuesday: "The next focus of attention will be getting more money in the E-Rate program because we have 12 million kids who still can't get access to broadband advantages that they deserve." (Her panel will vote on its organizing rules this morning, officially giving her the gavel.)

— Around the corner: E&C announced plans to hold a hearing next week on expanding internet access during the pandemic.

TEAM BIDEN HALTS TRUMP'S TIKTOK BATTLE — The (seemingly never-ending) saga over the future of popular video-sharing app TikTok may finally be fizzling out. The Biden administration asked a federal appeals court on Wednesday to freeze the legal battle it inherited with the Chinese company as the new government conducts a broader review of Trump's policies related to China. (That includes an executive order that sought to banish TikTok and its parent company, ByteDance, from the U.S.) ByteDance did not oppose the Biden team's request.

— What about Oracle? The Wall Street Journal also reported Wednesday that the deal TikTok had been negotiating to sell part of its U.S. operations to Oracle and Walmart has been suspended indefinitely. That deal was being forced by the Trump administration, which threatened to boot ByteDance if it did not find an American buyer for its U.S. business.

— View from the White House: White House press secretary Jen Psaki said Wednesday that the Biden administration has yet to take action on TikTok, and declined to provide a timeline for when it might do so. "I will note, broadly speaking, that we are comprehensively evaluating the risks to U.S. data, including from TikTok, and will address them in a decisive and effective fashion," she said. "If we have news to announce, we will announce it." Congressional China hawk Sen. Rick Scott described this as backtracking and "weakness toward Communist China."

— View from the former CEO: "The Trump administration didn't think out what they were doing very well," former TikTok CEO Kevin Mayer said Wednesday on Fox Business.

SLAUGHTER'S FAMOUS FIRST WORDS: MAKE PRIVACY VIOLATORS DELETE DATA — In her first major speech as acting FTC chair, Rebecca Kelly Slaughter said the agency should force companies that violate consumer privacy laws to give up data obtained illegally.

— Lead by example: Slaughter highlighted the FTC's January settlement with Everalbum — in which the agency forced the photo app to delete consumer data and a facial recognition algorithm it created without user consent — as a model for future enforcement. "Where companies collect and use consumers' data in unlawful ways, we should require violators to disgorge not only the ill-gotten data, but also the benefits — here, the algorithms — generated from that data," she said Wednesday at a Future of Privacy Forum event.

— Other priorities? Slaughter also noted two other privacy enforcement areas that she wants the FTC to focus on: biased and discriminatory algorithms, and facial recognition. Both can exacerbate existing racial disparities, she said, citing studies finding that health algorithms prioritize care for white patients over sicker Black patients, and incidents where faulty facial recognition has led to the wrongful arrests of Black men. Slaughter also expressed concerns about surveillance and the collection of location information about protestors.

Tarika Barrett, chief operating officer of Girls Who Code, will in April succeed Reshma Saujani as the organization's CEO; Saujani will stay on as board chair. … Google veteran Brittany Smith, who most recently ran partnerships and engagement on AI policy, human rights and racial justice at DeepMind, this month became Data & Society's first policy director.

Government crackdown: "A Tencent executive has been held by Chinese authorities for alleged unauthorized sharing of personal WeChat data," WSJ reports.

Will accept payment in bitcoin: "Twitter's finance chief said the social-media company has thought about how it might pay employees or vendors using the popular cryptocurrency bitcoin," WSJ reports.

Worth a close look: The Atlantic Council's Digital Forensic Research Lab and Just Security published "#StopTheSteal: A Timeline of Social Media and Extremist Activities Leading up to 1/6 Insurrection." The in-depth investigation and comprehensive chronology around #StopTheSteal tracks the movement from the 2016 election through the Capitol violence in January.

California, Washington, Virginia… it's all happening: "New state privacy initiatives turn up heat on Congress," The Hill reports.

SCOTUS watch: "Suing technology firms when they mess up is already hard, especially over privacy violations. Now, Facebook, Google, and the trade groups representing all the big tech firms are asking the Supreme Court to make it even harder for class actions to pursue cases against them," Ars Technica reports.

Tips, comments, suggestions? Send them along via email to our team: Bob King ([email protected], @bkingdc), Heidi Vogt ([email protected], @HeidiVogt), Nancy Scola ([email protected], @nancyscola), John Hendel ([email protected], @JohnHendel), Cristiano Lima ([email protected], @viaCristiano), Alexandra S. Levine (a[email protected], @Ali_Lev), and Leah Nylen ([email protected], @leah_nylen).

TTYL.

Free decrypter released for Avaddon ransomware victims... aaand, it's gone! - ZDNet

Posted: 11 Feb 2021 09:16 AM PST

A Spanish student has released a free decryption utility that can help victims of the Avaddon ransomware recover their files without paying.

Published on GitHub by Javier Yuste, a student at the Rey Juan Carlos University in Madrid, the AvaddonDecrypter works only in cases where victims have not powered off their computers.

The tool works by dumping an infected system's RAM and scouring the memory content for data that could be used to recover the ransomware's original encryption key.

If enough information is recovered, the tool can then be used to decrypt files and help victims recover from Avaddon attacks without needing to pay the gang's ransom demand.
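
ZDNet does not publish the decrypter's internals, but the general approach it describes, sweeping a memory dump for leftover key material, can be sketched. Below is a minimal, hypothetical Python illustration that flags high-entropy 32-byte windows in a raw RAM dump as candidate key material; the key length, window stride, and entropy threshold are assumptions, and a real tool such as Yuste's AvaddonDecrypter validates candidates far more rigorously before attempting decryption.

```python
# Hypothetical sketch only -- not the actual AvaddonDecrypter logic.
# Scans a raw memory dump for 32-byte windows with high byte entropy,
# a common heuristic for spotting candidate symmetric-key material in RAM.
import math
import sys
from collections import Counter

KEY_LEN = 32              # assume a 256-bit symmetric key
ENTROPY_THRESHOLD = 4.7   # bits per byte; tuning value, pure assumption
STEP = 16                 # stride between windows (assumption)

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def find_key_candidates(dump_path: str):
    """Yield (offset, window) pairs whose entropy suggests key material."""
    with open(dump_path, "rb") as f:
        dump = f.read()
    for offset in range(0, len(dump) - KEY_LEN + 1, STEP):
        window = dump[offset:offset + KEY_LEN]
        if shannon_entropy(window) >= ENTROPY_THRESHOLD:
            yield offset, window

if __name__ == "__main__":
    # Usage: python find_keys.py memory_dump.raw
    for offset, candidate in find_key_candidates(sys.argv[1]):
        print(f"0x{offset:08x}  {candidate.hex()}")
```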

Avaddon gang fixes their code

But while the tool's release will most likely help past victims, it won't help companies that fall victim to new Avaddon attacks.

This is because the tool's release did not go unnoticed. In a forum post on Wednesday, the Avaddon gang said it, too, had learned of Yuste's decrypter and had already deployed updates to its code, effectively negating the tool's capabilities.

The Avaddon team's reaction mirrors how the Darkside ransomware crew responded in January to the release of a similar decrypter for its own strain.

Infosec experts: Keep some ransomware decrypters private!

In the end, the release of both decryption utilities had a very limited impact. A few victims were able to decrypt their files, but once the tools were public, the ransomware gangs analyzed how they worked and fixed their code within days.

The release of these two tools, along with a blog post from Dutch security firm Eye Control showing how victims of the Data Doctor ransomware could recover from attacks, has rekindled a years-long conversation in the cyber-security industry about how decryption utilities should be handled and released to victims.

Several prominent security researchers with a long history of helping ransomware victims, going back to the mid-2010s, have made their opinions known again over the past two months, arguing that decryption utilities that exploit bugs in a ransomware's encryption should be kept private and distributed to victims through non-public channels rather than advertised online.

Furthermore, even when such tools do need to be made public, their release should not be accompanied by technical details, since those details obviously help the attackers patch their own code as well.

On the other hand, decryption utilities built around master decryption keys obtained from the attackers' servers are fine to share online, as there is little that ransomware authors can do about them.

All in all, seeing how quickly the Avaddon and Darkside groups reacted, fixing their encryption schemes within days, it is hard to argue with the case made online over the past two months that some decryption tools should never make it into the public domain.
