d.school launches 4-part workshop series on machine learning and blockchain

Oct. 14, 2018, 7:43 p.m.

On Friday, Stanford’s d.school held the first workshop in a four-part series examining the fundamentals of machine-learning analysis and blockchain systems. The workshops, which are available to both students and the general public at no cost, run from mid-October to late November with the goal of exposing a wider community to rapidly developing technologies.

“At the d.school, we are really interested in this intersection between design and emerging technology,” said d.school Director of Teaching and Learning Carissa Carter. “We believe strongly that we need to provide radical access to that.”  

The first workshop was a collaborative seminar focused specifically on machine learning, a form of automated data analysis in which computers teach themselves to solve problems without being explicitly programmed to do so. Combining collective discussion, brainstorming and reflection, Friday’s group not only broke down the intricacies of artificial intelligence and data configuration but also examined the technologies’ broader social impacts.

For Carter, examining how machine learning innovations spill over into civic discourse is crucial to the workshop’s goals. Viewing technology solely through a scientific lens, she said, overlooks software’s broader and often unintended repercussions.

“If you are making policy, building something as an engineer or making a medical device and putting them in the world, you have to understand that full spectrum from the data that goes into it all the way out to the implications it might have,” Carter said.

Andy Chamberlin, a life sciences technician and project manager at Stanford’s Hopkins Marine Station, attended the workshop alongside three colleagues to better understand the technology behind their own lab’s research. The team intends to pair machine learning with pathogen imagery to streamline its research process.

“It seemed like a good way to think about machine learning with a fresh perspective,” he said.

Mmakgantsi Mafojane, a policy researcher working in the energy access field, participated in the workshop to learn more about machine learning’s potential to produce more efficient energy models.

Given her legal background, Mafojane handles negotiations involving the technology’s application to software such as payment platforms, mapping systems and communication networks. She said she is worried about the ethical ramifications of machine learning.

“There’s a concern of things getting out of hand,” Mafojane said. “There are ethical boundaries. How do you build ethical checks and balances into machine learning processes?”

The ultimate objective, she added, is to “build inclusivity into products and the systems built around those products.”

According to Carter, machine learning also has limitations. The “learning” portion of the technology remains dependent on a body of data, and when that dataset is limited, machine learning software can fall short. In 2015, for example, Google Photos came under fire after the program’s machine learning algorithms, used to sort through hundreds of thousands of images, began to label one user’s photos of black friends as gorillas. The blunder resulted from biases in the software’s machine learning processes, which most likely lacked adequate data from black consumers.

Chamberlin said the workshop overturned many of his previous assumptions about machine learning. While he encouraged technological development, he also said unchecked advances can have consequences.

“It was beneficial to keep all those things in mind,” he said, “rather than just thinking, ‘Can I get this project to work?’”

Carter said she feels that such analysis is lacking in today’s world.

“It’s just as important for someone who is an earth scientist or a humanist to know what new tech is doing,” Carter said. “They will be able to offer really interesting ideas [as] to how it might be used that someone who has a tech background just doesn’t see.”

The event aimed to create a space for those kinds of people to engage with machine learning, an engagement that Carter feels drives inclusive innovation.

“In order for these technologies to represent us, we all have to be a part of making them,” she said. “I want to get us one step closer to that.”

 

This post has been updated to clarify that Andrew Chamberlin works at the Stanford Hopkins Marine Station, not the Stanford John Hopkins Marine Station. The lab was named after Timothy Hopkins, and has no connection to John Hopkins. The Daily regrets this error.

Contact Ricardo Lopez at rclopez ‘at’ stanford.edu.