Episodes

  • Jonathan Jackson is a co-founder of Blavity Inc., a technology and media company for Black millennials. Blavity's mission is to "economically and creatively support Black millennials across the African diaspora, so they can pursue the work they love, and change the world in the process." Blavity has grown immensely since its founding in 2014 — among other things, spawning five unique sites, reaching over 7 million visitors a month, and organizing a number of technology, activism, and entrepreneurship conferences.

    Jonathan Jackson is also a Joint Fellow with the Nieman Foundation and the Berkman Klein Center for Internet & Society for 2018-2019. During his time here, he says, he is looking for frameworks and unique ways to measure Black cultural influence (and the economic impact of Black creativity) in the US and around the world.

    Jonathan sat down with the Berkman Klein Center's Victoria Borneman to talk about his work.

    Music from this episode: "Jaspertine" by Pling, licensed under Creative Commons Attribution Noncommercial (3.0)

    More information about this work, including a transcript, can be found here: https://cyber.harvard.edu/story/2019-01/get-know-berkman-klein-fellow-jonathan-jackson

  • Berkman Klein Center interns sat down with 2018 Berkman Klein Center Fellow Amy Zhang to discuss her work on combating online harassment and misinformation, as well as her research as a Fellow.

  • According to a recent Pew Research Center study, Instagram is the second most popular platform among 13- to 17-year-olds in the US, after YouTube. Nearly 72 percent of US teenagers are on the image-sharing platform.

    Our Youth & Media team looked at how teens are using Instagram to figure out who they are. While Instagram may seem like just a photo-sharing platform, users have molded it into a more complex social media environment, with dynamics and a shared internal language almost as complex as a typical middle or high school.

    This episode was produced by Tanvi Kanchinadam, Skyler Sallick, Quinn Robinson, Jessi Whitby, Sonia Kim, Alexa Hasse, Sandra Cortesi, and Andres Lombana-Bermudez.

    More information about this work, including a transcript, can be found here: https://cyber.harvard.edu/story/2018-11/how-youth-are-reinventing-instagram-and-why-having-multiple-accounts-trending

  • We encounter algorithms all the time. There are algorithms that can guess, within a fraction of a percentage point, whether you'll like a certain movie on Netflix, a post on Facebook, or a link in a Google search.

    But the risk assessment tools now being adopted by criminal justice systems all across the country - from Arizona, to Kentucky, to Pennsylvania, to New Jersey - are made to guess whether you're likely to flee the jurisdiction of your trial, or to commit a crime again if you are released.

    With stakes as high as this — human freedom — some are asking for greater caution and scrutiny regarding the tools being developed.

    Chris Bavitz, managing director of the Cyberlaw Clinic at Harvard Law School, helped draft an open letter to the Massachusetts state legislature about risk assessment tools, co-signed by a dozen researchers working on the Ethics and Governance of Artificial Intelligence. He spoke with Gretchen Weber about why we need more transparency and scrutiny in the adoption of these tools.

    Read the open letter here: https://cyber.harvard.edu/publications/2017/11/openletter

  • Even before Election Day 2016, observers of technology and journalism were delivering warnings about the spread of fake news. Headlines like "Pope Francis Shocks World, Endorses Donald Trump For President" and "Donald Trump Protestor Speaks Out, Was Paid $3500 To Protest" would pop up, seemingly out of nowhere, and spread like wildfire.

    Both of those headlines, and hundreds more like them, racked up millions of views and shares on social networks, gaining enough traction to earn mentions in the mainstream press. Fact checkers only had to dig one layer deeper to find that the original publishers of these stories were entirely fake, clickbait news sites, making up false sources, quotes, and images, often impersonating legitimate news outlets like ABC, and taking home thousands of dollars a month in ad revenue. But by that time, the damage of fake news was done - the story of the $3500 protestor had already calcified in the minds of the casual news observer as fact.

    It turns out that it's not enough to expect the average person to be able to tell the difference between news that is true and news that merely seems true. Unlike the food companies that create the products on our grocery shelves, news media are not required by law to be licensed, inspected, or to bear a label of ingredients and nutrition facts - not that they should or could be.

    But the gatekeepers of news media that we encounter in the digital age - social media platforms like Facebook and Twitter, search engines like Google, and content hosts like YouTube - could and should be pitching in to help news consumers navigate the polluted sea of content they interact with on a daily basis.

    That's according to Berkman Klein Center co-founder Jonathan Zittrain and Zeynep Tufekci, a techno-sociologist who researches the intersection of politics, news, and the internet. They joined us recently to discuss the phenomenon of fake news and what platforms can do to stop it.

    Facebook and Google have recently instituted processes to remove fake news sites from their ad networks. And since this interview, Facebook has also announced options allowing users to flag fake news, and a partnership with the fact-checking website Snopes to offer a layer of verification on questionable sites.

    For more on this episode visit: https://cyber.harvard.edu/interactive/radioberkman238

    CC-licensed content this week:
    Neurowaxx: "Pop Circus" (http://ccmixter.org/files/Neurowaxx/14234)
    Photo by Flickr user gazeronly (https://www.flickr.com/photos/gazeronly/10612167956/)

  • The effects of surveillance on human behavior have long been discussed and documented in the real world. That nervous feeling you get when you notice a police officer or a security camera? The one that forces you to straighten up and be on your best behavior, even if you're doing nothing wrong? It's quite common.

    The sense of being monitored can cause you to quit engaging in activities that are perfectly legal, even desirable, too. It's a kind of "chilling effect." And it turns out it even happens online.

    Researcher Jon Penney wanted to know how the feeling of being watched or judged online might affect Internet users' behavior. Does knowledge of the NSA's surveillance programs affect whether people feel comfortable looking at articles on terrorism? Do threats of copyright law retaliation make people less likely to publish blog posts?

    Penney's research showed that, yes, the chilling effect has hit the web. On today's podcast we talk about how he did his research, and why chilling effects are problematic for free speech and civil society.

    Creative Commons photo via Flickr user fotograzio (https://www.flickr.com/photos/fotograzio/23587980033/)

    Find out more about this episode here: https://cyber.law.harvard.edu/node/99495

  • "George Lucas built a whole new industry with Star Wars," says Peter S. Menell, devoted science fiction fan and a professor at the UC Berkeley School of Law, who studies copyright and intellectual property law. "But what funds that remarkable company is their ways of using copyright."

    And he's right. A third of the profits LucasFilm pulls in from Star Wars has come from merchandising alone (http://www.forbes.com/sites/aswathdamodaran/2016/01/06/intergalactic-finance-how-much-is-the-star-wars-franchise-worth-to-disney/#74c6181b2d79). Not ticket sales, not DVDs, not video games or books. Toys, clothes, and weird tie-ins like tauntaun sleeping bags and Wookiee hair conditioner.

    But fans of Star Wars, and of other stratospherically profitable creative universes, increasingly like to become creators within those universes. They write books, they make costumes, they direct spinoffs and upload them to YouTube. And sometimes they make money.

    How does the law come into play when fans start to reinterpret intellectual property? We sat down with Menell to see where the tensions lie between the law, the courts, and the George Lucases of the world.

    Creative Commons music used in this episode:
    David Szesztay, "Morning One"
    Broke For Free, "Something Elated"

    Image courtesy of Flickr user kalexanderson: https://flic.kr/p/b7YWCD

    Subscribe to us on iTunes: https://itunes.apple.com/us/podcast/radio-berkman/id298096088?mt=2

  • In her article "The Secret Lives of Tumblr Teens," Elspeth Reeve tells the stories of some of Tumblr's most popular bloggers - kids who started their blogs in high school, made a ton of money, and then inexplicably disappeared. In this episode we talk to Reeve about what she discovered when she went looking for these teens, and what that can tell you about Tumblr and the teenage child stars of the Internet.

    Read Reeve's article here: https://newrepublic.com/article/129002/secret-lives-tumblr-teens

    Credits: Produced by Daniel Dennis Jones and Elizabeth Gillis. Music by Dave Depper ("Rare Groove," "Sharpie," "Heartstrings"), Podington Bear ("Golden Hour"), and Anitek ("Beta Blocker").

  • An artist, musician, or writer can't just take another person's creation and claim it as their own. Federal law outlines how creators can and can't borrow from each other. These rules are collectively called "copyright law," and essentially they give creators the exclusive right to copy, modify, distribute, perform, and display their creative works.

    Copyright law was originally created as an incentive. If creators aren't worrying about whether someone might steal their work, they're more likely to share their ideas with the public. This kind of sharing in turn helps to create more ideas, products, jobs, art, and whole industries.

    But even with copyright there are exceptions - times when another artist can use a copyrighted work without getting the copyright holder's permission. This safe zone is called "Fair Use." On this episode of the podcast we'll tell you everything you need to know about Fair Use in 6 minutes!

    Reference Section
    Photo courtesy of Fair Use/Fair Dealing Week
    Music courtesy of "Beta Blocker" by Anitek

    This week's episode was written by Leo Angelakos, Elizabeth Gillis, Daniel Dennis Jones, and Olga Slobodyanyuk, and edited by Elizabeth Gillis.

    Visit http://www.fairuseweek.org for even more information and resources on Fair Use.
    Visit http://dlrp.berkman.harvard.edu/ for information on how to incorporate digital resources and fair use friendly practices in classrooms.

    Special thanks this week to Andres Lombana-Bermudez of the Youth & Media team, and Chris Bavitz of the Cyberlaw Clinic.

    For more information on this episode, including a transcript, visit http://cyber.law.harvard.edu

  • Are you really "you" online?

    We asked around for stories of digital alter egos — secret identities that people maintain on the web and try to keep separate from their real-life identities. And it turns out there are lots of reasons — some good, some nefarious, some maybe both — to have an alternate persona online.

    On this episode we share stories of catfishers, sock puppets, and digital doppelgangers.

    Reference Section
    Photo courtesy of carbonnyc
    Music courtesy of Podington Bear, MCJackinthebox, Blue Dot Sessions, and David Szesztay

    This week's episode was produced by Daniel Dennis Jones and Elizabeth Gillis, with oversight from Gretchen Weber, and extra help from Adam Holland, Tiffany Lin, Rebekah Heacock Jones, Annie Pruitt, and Carey Andersen.

    More info on this story here: https://cyber.law.harvard.edu/node/99359

  • You've likely heard of Silk Road - the black market e-commerce hub that was shut down in 2013 after becoming a magnet for vendors of illicit goods. But the story of its shutdown, and of the investigation and trial that followed, is complicated enough that we need a guide.

    On this week's podcast, Berkman affiliate Hasit Shah brought together members of the Berkman community to speak with journalist and legal expert Sarah Jeong about what it was like to follow the Silk Road trial, and how the justice system copes when technology becomes a central part of a case.

    Listeners: We need your stories! Was there ever a time you used the web to be anonymous? Have you ever had a digital alter ego? If you've ever used a blog or a social media account to do something you didn't want connected to your real identity, we want to hear about it! We've set up a special hotline. All you have to do is call in and tell your story on our voicemail, and we'll feature you on an upcoming episode: (617) 682-0376.

  • On your computer, you don't ever really "take out the trash." Data doesn't get picked up by a garbage truck. It doesn't decompose in a landfill. It just accumulates.

    And because space is becoming less and less of an issue - hard drive space keeps getting cheaper, and a lot of the apps we use have cloud storage anyway - deleting our files is a thing of the past. We become digital hoarders.

    But what happens when we dig up those old files from years ago? Those old emails from our boyfriend or girlfriend, those old digital photos of family, those long rambling journal entries?

    On this week's podcast we talk to three researchers who all have different stories of digital hoarding, deleting, and recovering. Jack Cushman, Judith Donath, and Viktor Mayer-Schönberger talk about the value of remembering, the value of forgetting, and what we trust to our machines.

    More information on this episode, including links and credits, here: https://cyber.law.harvard.edu/node/99207

  • Facebook has had a lot of trouble with misogynistic speech. A few years ago, several women's groups joined together to petition Facebook to work harder to block misogynistic pages, posts, and replies. At the time, Facebook had strict standards against hate speech that was racist or anti-Semitic — such speech would be blocked or taken down. These groups simply asked that gendered hate speech receive the same treatment.

    It was ironic, people said, that Facebook would commonly take down photos of women breastfeeding in response to complaints - such content was deemed pornographic - but when Facebook users complained about comments that were misogynistic or that harassed women, Facebook defended its decisions not to take them down. Its reasoning was one of semantics: comments that described gendered violence didn't actually threaten violence, it would argue. But, campaigners pointed out, misogynistic content actually is threatening, and creates an unsafe environment for speech.

    The campaigners won. But this isn't the first time Facebook's policies on censorship have been questioned by the public. And it won't be the last. Right now, many European countries are asking Facebook to more strictly police hate speech on the platform.

    Jillian York is a writer and the Director for International Freedom of Expression at the Electronic Frontier Foundation. She joined us to talk about the most recent debates about online speech, and why she questions whether these kinds of decisions should be left up to Facebook at all.

    Find more information on this episode, and subscribe to the podcast, here: https://cyber.law.harvard.edu/node/99190

  • A recent New York Times survey of the top 50 news sites showed that blocking ads while surfing their mobile news sites could save up to 14 megabytes per page loaded. Those 14 megabytes add up to 30 seconds of load time over 4G, and, if you're on a restricted data plan, could cost you 30 cents per page - all of that money going to your mobile provider, not to the content publisher.

    But for content publishers, and the ad providers that keep them alive, ad blocking poses a huge problem. Most of the commercial web as we know it exists because of advertising. When web users aren't loading ads on their favorite ad-supported site, or otherwise paying the site - by subscribing, sponsoring, or buying merchandise - the site is losing out on cash.

    And we're talking serious cash. Digital ad spending is expected to reach $170.17 billion in 2015, with $69 billion - 40% of ad spending - in the mobile space. That's a lot of money to spend on ads that might not even be seen. Ad block software is now in use by 200 million people around the globe.

    Doc Searls is a journalist and author who worked in the ad industry years ago. He has referred to ad blocking as "the biggest boycott in human history." Radio Berkman producer Elizabeth Gillis spoke with Searls about what's going on in the Ad Block Wars, and the part played by users like you.

    More info on this episode, including links and credits, here: https://cyber.law.harvard.edu/node/99177

  • The current generation of search engines just tells you where to find information (it returns a list of webpages). The next generation of search engines could anticipate what you are searching for, and actually find the information for you.

    In this conversation, Cynthia Rudin — associate professor of statistics at the Massachusetts Institute of Technology and director of the Prediction Analysis Lab — leads a brainstorming session to envision the future of the search engine.

    Challenges for the audience:
    * What are some cases where Google fails miserably?
    * Do you sometimes want to find the answer to a complicated type of question? How do you envision the answer being presented?
    * Do you have ideas of what the capabilities will be for the next generation of search engines?
    * What are some lessons we can learn from search engines from the past?

    This will be mainly audience participation.

    More information on this event here: https://cyber.law.harvard.edu/node/99130

  • Ethnographer Whitney Phillips embedded with the trolls of 4chan, observing for years how anonymous members of its subversive /b/ board memed, pranked, harassed, and abused, all for the "lolz" — the thrill of doing something shocking. The result is a book, "This Is Why We Can't Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture," that sheds light on how and why trolls do what they do.

    Beyond pushing the boundaries of taste within their own communities — the /b/ board recently made headlines for a case in which anonymous members allegedly goaded one of their own to cut off his own toe — troll behavior has had an incredibly broad impact on society. Trolling has shaped the way social platforms and conversations on public forums take place. It is in no small part due to the spread of troll culture that comments sections, Facebook threads, and Twitter conversations can be minefields for productive conversation; the troll dialect is better equipped for shock and ironic bigotry than for sincerity, and a sincere conversation is just begging to be disrupted, especially when you disagree with your target.

    But while wrench-throwing can be, and has been, a very important tool in online discourse, the web has started to outgrow trolls. In 2003, when 4chan was launched, there were under 700 million people on the Internet (predominantly higher-income, younger, white, Western, male, and native English speakers), compared to 3.2 billion people today, from many backgrounds. The incredible diversity of individuals all trying to have conversations on the same platforms has increased demand for civility, understanding, and inclusiveness, even as the conversations can seem more and more cacophonously problematic. And this threatens to make trolling less funny.

    Whitney joins us this week to talk about how troll culture has changed over the years, and what platforms can do to temper darker forms of discourse.

    For more on this week's episode visit: https://cyber.law.harvard.edu/node/99117

  • Bitcoin is having its 7th birthday, and its promise to change the way the world thinks about money is looking less and less hyperbolic.

    For one, the block chain technology underlying Bitcoin - the public ledger that makes the exchange transparent and accountable - is now being used to clean up Wall Street. A block chain-inspired service announced recently could open up the practice of lending stocks, and help prevent the kind of out-of-control short selling that led to the crash of 2008.

    But there is a lot that people still don't understand about Bitcoin and the block chain. We spoke with incoming Berkman Fellow Patrick Murck of the Bitcoin Foundation to explain.

    Flickr photo courtesy of btckeychain
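    The "public ledger" idea can be illustrated with a minimal sketch. This is not Bitcoin's actual implementation (there is no mining, networking, or signing here; the block structure and function names are hypothetical simplifications): each block simply stores the hash of the block before it, so tampering with any earlier entry invalidates every link that follows.

    ```python
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        # Hash a block's canonical JSON form with SHA-256.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def add_block(chain: list, transactions: list) -> None:
        # Each new block commits to the previous block's hash,
        # chaining the ledger together.
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "transactions": transactions})

    def verify(chain: list) -> bool:
        # The chain is valid only if every stored prev_hash matches
        # the recomputed hash of the block before it.
        return all(
            chain[i]["prev_hash"] == block_hash(chain[i - 1])
            for i in range(1, len(chain))
        )

    chain = []
    add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
    add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
    assert verify(chain)

    # Tampering with an earlier block breaks verification.
    chain[0]["transactions"][0]["amount"] = 500
    assert not verify(chain)
    ```

    Because anyone holding a copy of the ledger can recompute the hashes, no participant has to be trusted on their word alone - which is the "transparent and accountable" property described above.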

  • The market for recorded music has undergone at least three major reinventions since the dawn of the Internet. At the turn of the century, illegal downloading ate away at the music industry's bottom line. Then the iTunes music store made it easy to buy music again, albeit disaggregated from its album form.

    Then along came streaming. The combination of ubiquitous Internet connectivity and a bottomless consumer appetite for music has led to the success of applications like Pandora, Spotify, and Rdio, which allow users to access entire music catalogs from virtually anywhere for next to nothing.

    Streaming has worked. In 2014 alone, at least 164 billion tracks were played across all streaming services, according to Nielsen. And these streaming companies are raking in incredible amounts of cash from advertising and user subscription fees.

    Where does the money go? A recent study from Berklee College of Music's Institute for Creative Entrepreneurship showed that 20 to 50 percent of music revenues might never make it to their rightful owners. In some cases, artists might get 20% or less of the already tiny dollar amounts coming in from streaming services. But no one knows for sure.

    In a New York Times op-ed this week, David Byrne asked the music industry to "open the black box" and let everyone - the artists, the labels, the distributors, the listeners - know exactly where the money goes.

    On this week's episode of the podcast we try to find out if we can crack into the stream and figure out where the money is flowing.

    CC photo courtesy of Flickr user hobvias sudoneighm

    Find a transcript of this episode here: https://docs.google.com/document/d/1b_vhqKu3OVVOOPddj64HZLQthe5k4pzinsF5XmYpJ7M/edit?usp=sharing

  • With 316 million users posting 500 million tweets a day, someone is bound to write an unoriginal tweet now and then.

    But there are some Twitter users whose entire existence relies completely on plagiarizing tiny jokes and relatable observations created by other Twitter users. Many plagiarizing accounts have follower counts ranging from the thousands to the millions, meaning their exposure can lead to career opportunities and sponsorships built on the creativity of others who are just getting started in their writing careers.

    So it was not without excitement that Twitter users found out last week that they can report plagiarizing accounts to Twitter under the Digital Millennium Copyright Act, and have these copied tweets removed. But now we're forced to ask the question: are jokes protected under copyright?

    We asked Andy Sellars of Harvard Law School's Cyberlaw Clinic to weigh in.

    Flickr photo courtesy of wwworks
    Music from Podington Bear, "Bright White"