Can Wikipedia-like citations on YouTube curb misinformation?
Thu, 09 May 2024

UW researchers created and tested a prototype browser extension called Viblio, which lets viewers and creators add Wikipedia-like citations to YouTube videos.

While Google has long been synonymous with search, people are increasingly seeking information directly through video platforms such as YouTube. Videos can be dense with information: text, audio, and image after image. Yet each of these layers presents a potential source of error or deceit. And when people search for videos directly on a site like YouTube, sussing out which videos are credible sources can be tricky.

To help people vet videos, UW researchers created and tested Viblio, a browser extension that lets viewers and creators add Wikipedia-like citations to YouTube videos. The prototype offers users an alternate timeline, studded with notes and links to sources that support, refute or expand on the information presented in the video. Those links also appear in a list view, like the “References” section at the end of Wikipedia articles. In tests, 12 participants found the tool useful for gauging the credibility of videos on topics ranging from biology to political news to COVID-19 vaccines.

The team will present its research May 14 in Honolulu at the ACM CHI Conference on Human Factors in Computing Systems. Viblio is not available to the public.

“We wanted to come up with a method to encourage people watching videos to do what’s called ‘lateral reading,’ which is that you go look at other places on the web to establish whether something is credible or true, as opposed to diving deep into the thing itself,” said senior author Amy Zhang, an assistant professor in the Paul G. Allen School of Computer Science & Engineering. “In previous research, I’d worked with the people at X’s Community Notes and with Wikipedia and seen that crowdsourcing citations and judgments can be a useful way to call out misinformation on platforms.”

Viblio offers users an alternate timeline, studded with notes and links to sources that either support or refute the information presented in the video. Photo: Hughes et al./CHI 2024

To inform Viblio’s design, the team studied how 12 participants — mostly college students under 30 — gauged the credibility of YouTube videos while searching for them on the platform and while watching them. All said familiarity with the video’s source and the name of the channel were important. But many also relied on weaker credibility signals: the quality of the video, their degree of interest in it, its ranking in search results, its length and the number of views or subscribers.

The team also found that one participant misinterpreted a YouTube information panel as an endorsement of the video from the Centers for Disease Control and Prevention. In fact, these panels are links to supplemental information that YouTube attaches to videos on “topics prone to misinformation.”

“The trouble is that a lot of YouTube videos, especially more educational ones, don’t offer a great way for people to prove they’re presenting good information,” said Hughes, a doctoral student at the University of Notre Dame who completed this research as a UW undergraduate in the Information School. “I’ve stumbled across a couple of YouTubers who were coming up with their own ways to cite sources within videos. There’s also no great way to fight bad information. People can report a whole video, but that’s a pretty extreme measure when someone makes one or two mistakes.”

The researchers designed Viblio so users can better understand videos’ content while avoiding such misinterpretations. To add a citation, users click a button on the extension. They can then add a link, select the timespan their citation references and add optional comments. They can also select the type of citation, which marks it with a colored dot in the timeline: “refutes the video clip’s claim” (red), “supports the video clip’s claim” (green) or “provides further explanation” (blue).

To add citations, users click a button that presents the options shown here. Photo: Hughes et al./CHI 2024
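The citation workflow described above can be sketched as a small data structure. This is a hypothetical illustration based only on the article’s description — the class, field names and color mapping are ours, not Viblio’s actual code:

```python
from dataclasses import dataclass

# Illustrative mapping from citation type to timeline dot color,
# following the article: refutes (red), supports (green), explains (blue).
CITATION_COLORS = {
    "refutes": "red",
    "supports": "green",
    "explains": "blue",
}

@dataclass
class Citation:
    url: str           # link to the external source
    start_s: float     # start of the referenced timespan, in seconds
    end_s: float       # end of the referenced timespan, in seconds
    kind: str          # one of the CITATION_COLORS keys
    comment: str = ""  # optional note from the contributor

    def dot_color(self) -> str:
        """Color of the dot placed on the video's citation timeline."""
        return CITATION_COLORS[self.kind]

# Example: a citation refuting a claim made between 1:05 and 1:30.
c = Citation("https://example.org/fact-check", 65.0, 90.0, "refutes")
print(c.dot_color())  # prints "red"
```

A list view like Viblio’s “References” panel would then just sort such records by `start_s` and render each with its link and comment.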

To test the system, the team had the study participants use Viblio for two weeks on a range of videos, including clips from Good Morning America, Fox News and ASAPScience. Participants could add citations as well as watch videos with other participants’ citations. For many, the added citations changed their opinion of certain videos’ credibility. But the participants also highlighted potential difficulties with deploying Viblio at a larger scale, such as the conflicts that arise in highly political videos or those on controversial topics that don’t fall into true-false binaries.

“What happens when people with different value systems add conflicting citations?” said co-author Mitra, a UW assistant professor in the Information School. “We of course have the issue of bad actors potentially adding misinformation and incorrect citations. But even when users are acting in good faith yet hold conflicting opinions, whose citation should be prioritized? Or should we show both conflicting citations? These are big challenges at scale.”

The researchers highlight a few areas for further study, such as expanding Viblio to other video platforms such as TikTok or Instagram; studying its usability at a greater scale to see whether users are motivated enough to continue adding citations; and exploring ways to create citations for videos that get less traffic and thus have fewer citations.

“Once we get past this initial question of how to add citations to videos, then the community vetting question remains very challenging,” Zhang said. “It can work. At X, Community Notes is working on ways to prevent people from ‘gaming’ voting by looking at whether someone always takes the same political side. And Wikipedia has standards for what should be considered a good citation. So it’s possible. It just takes resources.”

Additional co-authors on the paper include , who completed this work as an undergraduate at the UW and is now at Microsoft; , who completed this research as a UW doctoral student in the iSchool and is now an assistant professor at Seattle University; and , who completed this work as a UW graduate student in human-centered design and engineering and is now a doctoral student at the University of California San Diego. This research was funded by the WikiCred Grants Initiative.

For more information, contact Zhang at axz@cs.uw.edu, Hughes at ehughes8@nd.edu, and Mitra at tmitra@uw.edu.

Three UW teams awarded NSF Convergence Accelerator grants for misinformation, ocean projects
Fri, 01 Oct 2021

Three separate UW research teams have been awarded $750,000 each by the National Science Foundation to advance studies in misinformation and the ocean economy.

The teams were selected for phase 1 of the Convergence Accelerator program’s 2021 cohort. The federal agency hopes to build upon basic research and discovery to accelerate solutions in two critical areas: the “Networked Blue Economy” and “Trust and Authenticity in Communications Systems.”

One team, from the UW Applied Physics Laboratory, was selected for the “Networked Blue Economy” track, and two UW teams — one from the UW Information School and another from the APL — were selected for the “Trust and Authenticity in Communications Systems” track.

Designed to transition basic research and discovery into practice, the Convergence Accelerator uses innovation processes like human-centered design, user discovery, team science, and integration of multidisciplinary research and partnerships. The Convergence Accelerator, now in its third year, aims to solve high-risk societal challenges through use-inspired convergence research, according to NSF.

The three projects that UW teams will lead are:

  • The “” project, from the APL and industry partners, will produce a flexible proof-of-concept technology to help people evaluate the source of information and its reliability. Drawing on the fields of technology development, law, business, policy, curriculum development, community management, interdisciplinary research and finance, the team will develop tools and components to generate and communicate digital “trust signals” in various settings. The result will be a proof-of-concept for a verified information exchange that would support tools that users can deploy to assess the trustworthiness and authenticity of digital information. Workstreams are anticipated to include food system safety and security, bank and financial information systems, public health information systems, academic publication and supply chains. , a principal research scientist at the APL, is the lead investigator.
  • The “” project team, composed of a multidisciplinary set of researchers from the UW, the University of Texas at Austin, Washington State University, Seattle Central College and Black Brilliance Research, will plan, facilitate and assess a series of seven workshops focusing on critical reasoning skills, the psychological and emotional aspects of information, and broader sociocultural dimensions of trust in information ecosystems. The workshop series will be hosted in collaboration with a diverse group of local stakeholders in Washington state and Texas, including urban and rural libraries, news outlets, civic organizations, and underrepresented communities. , an Information School associate professor and UW co-founder, is the principal investigator on the project.
  • In the “” project, three new community-run ocean sensors will provide Indigenous coastal communities with real-time data on the changing ocean environment. The floating systems, anchored to the seafloor, will be deployed in collaboration with coastal communities in Alaska, the Pacific Northwest and the Pacific Islands. Sofar Ocean’s existing buoy systems — designed to be affordable and convenient — can measure waves, sea surface temperature, cloudiness of the water, and water depth, and come equipped with solar power, satellite communication and potential for expansion. The project will be carried out through the UW-based NANOOS as well as its counterparts in Alaska and the Pacific Islands, which have long-standing, trusted relationships with Indigenous and coastal communities. , an oceanographer at the APL and the director of NANOOS, is the lead investigator.

Additionally, Assistant Professor Amy Zhang and Associate Professor , both in the UW Paul G. Allen School of Computer Science & Engineering, are co-principal investigators on a team led by the international grassroots community . That team aims to develop practical interventions to help individuals and community moderators analyze information quality, including misinformation, to build trust and address vaccine hesitancy. Zhang also is on another team, based at the University of Michigan, that will help media platforms determine how to flag articles that contain misinformation.

During phase 1, each UW team will engage with the other members of its cohort in a fast-paced, nine-month, hands-on journey that includes the program’s innovation curriculum, a formal pitch and a phase 2 proposal evaluation. The program’s team-based approach creates a “co-opetition” environment, stimulating the sharing of innovative ideas toward solving complex challenges together while competing to progress to phase 2.

At the end of phase 1, each team participates in a formal pitch and proposal evaluation. Selected teams from phase 1 will proceed to phase 2, with potential funding up to $5 million for 24 months. Phase 2 teams will continue to apply Convergence Accelerator fundamentals to develop solution prototypes and to build a sustainability model to continue impact beyond NSF support. By the end of phase 2, teams are expected to provide high-impact solutions that address societal needs at scale.

Launched in 2019, the NSF Convergence Accelerator program builds upon basic research and discovery to accelerate solutions toward societal impact. Using convergence research fundamentals and integration of innovation processes, it brings together multiple disciplines, expertise and cross-cutting partnerships to solve national-scale societal challenges.
