Rachel Thomas, Founder of fast.ai & Assistant Professor at the University of San Francisco

Dr. Rachel Thomas is the founder of fast.ai, a non-profit research organization that aims to make AI more accessible by offering free classes and a large resource library. One of her courses, “Practical Deep Learning for Coders,” has enrolled over 100,000 students, and her writing has been translated into Chinese, Korean, Spanish, and Portuguese.

As a researcher and assistant professor at the University of San Francisco, she helped develop the world’s first open-access deep learning certificate program and has taught graduate-level numerical linear algebra. She also serves as an advisor for Deep Learning Indaba, a non-profit that aims to spread and strengthen machine learning in Africa. Additionally, she was a senior instructor and cohort manager at San Francisco’s Hackbright Academy, a selective boot camp for women who want to become software engineers. Notably, Dr. Thomas was an early engineer at Uber and was named to Forbes’ list of 20 incredible women advancing AI research. She earned her PhD in mathematics from Duke University.

Dr. Thomas generously gave 30 minutes of her time to OnlineEducation.com. Her interview has been lightly edited for length.

Interview Questions

[OnlineEducation.com] For those of us who haven’t had a chance to experience your free MOOCs at fast.ai, how would you explain deep learning in layperson’s terms?

[Rachel Thomas] Deep learning refers to a specific class of algorithms also known as “multi-layered neural networks.” These algorithms are getting amazing, world-class results on a variety of problems, such as identifying what object is in a picture, translating text, or understanding speech. It’s used in voice recognition as well.
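To make “multi-layered” concrete, here is a toy sketch in plain Python (an illustration only, not from the interview or fast.ai’s materials): each layer computes weighted sums of its inputs and passes them through a nonlinearity, and stacking such layers is what makes a network “deep.”

```python
import math
import random

random.seed(0)

def layer(inputs, weights, biases):
    # One "layer": a weighted sum per output unit, passed through a
    # nonlinearity (here, tanh).
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Random weights for a tiny 3-input -> 4-hidden -> 2-output network.
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b1 = [0.0] * 4
w2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b2 = [0.0] * 2

x = [0.5, -0.2, 0.8]            # e.g. pixel intensities or audio features
hidden = layer(x, w1, b1)       # first layer of the multi-layer stack
output = layer(hidden, w2, b2)  # second layer produces the final scores
print(len(hidden), len(output))
```

In a real network there would be many more layers and the weights would be learned from data rather than chosen at random, but the layered structure is the same.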

[OnlineEducation.com] A lot of important applications. Who are the mentors who got you interested in this field?

[Rachel Thomas] I don’t think of these people as traditional mentors, but I’ve learned a lot from them and have been encouraged by them. One is my co-founder Jeremy Howard. I’ve learned a ton working with him, watching how he solves problems, both technically in code and in business.

I had some excellent math professors at Swarthmore College. Janet Talvacchia and Cheryl Grood were very encouraging and gave me a lot of helpful advice about approaching graduate school.

And in grad school, I had a friend Carolyn who finished her PhD a year or two ahead of me. She was super encouraging and helped me get across the finish line. It was definitely a tough haul! She’s now a physics professor.

[OnlineEducation.com] You must have had a lot of encouragement because you’re incredibly accomplished in your field. In addition to co-founding fast.ai, you worked as an engineer at Uber and were named as one of the Forbes “20 incredible women advancing AI research.” Considering all of your success, what would be your dream discovery or moonshot? What problem keeps you up at night that you would really love to tackle?

[Rachel Thomas] It’s the problem I’m working on with fast.ai: making deep learning easier to use and getting it into the hands of people who wouldn’t normally have access. One of our students who just finished our course is a Canadian dairy farmer. He said he needed to learn how to use AI to improve the health of his goats’ udders. I thought that was such an awesome application and something I never would have thought of! I want every dairy farmer, and people of any profession anywhere in the world, to be able to use this technology on the problems they care about.

This is a relatively young field. An area that I’m really interested in is called “transfer learning,” which allows people with small datasets to use deep learning. We actually have examples of people getting awesome results with small datasets using networks that were trained on a large dataset for a similar problem. Sometimes, even if the problem is different, the network can be adapted to the new task. There’s still a ton of research that needs to be done to make these networks easier to train and less fragile.
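As a rough illustration of the idea (a hypothetical sketch, not fast.ai’s actual code): transfer learning reuses layers trained on a large dataset as a fixed feature extractor, and only a small final layer is fit on the new, small dataset.

```python
# Schematic transfer learning: the "pretrained" part stays frozen and
# only a small head is fit to the new data. All names are hypothetical.

def pretrained_features(x):
    # Stand-in for early layers trained on a large dataset; in practice
    # these would be the frozen layers of a deep network.
    return [x, x * x]

def fit_head(data):
    # Fit a one-weight linear "head" by least squares on the first
    # pretrained feature; real code would optimize over all features.
    pairs = [(pretrained_features(x)[0], y) for x, y in data]
    num = sum(f * y for f, y in pairs)
    den = sum(f * f for f, _ in pairs)
    return num / den

small_dataset = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.0)]  # tiny new task
w = fit_head(small_dataset)
print(round(w, 2))
```

The point of the sketch is the division of labor: the expensive representation is learned once on a big dataset, so the new task only needs enough data to fit the small head.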

[OnlineEducation.com] What are some of the most exciting applications of deep learning for you?

[Rachel Thomas] Medicine is definitely a huge one. It’s important to remember that there is a global shortage of doctors, and it is quite large in some countries. For instance, in Nigeria, it would take 300 years for the existing pipeline to train enough doctors to fill the shortage. I know a pediatric radiologist from South Africa who said there are only 14 pediatric radiologists for the entire African continent. My hope is that deep learning will help doctors and community health workers be more effective.

To take another industry, there are people working on robots and sensors that would allow farmers to use fewer pesticides. If you can recognize a weed or a bug and have a robot pluck it off, you can really increase agricultural productivity with fewer harmful side effects.

[OnlineEducation.com] What are the major challenges to making AI as sophisticated as the human brain? Are we even close?

[Rachel Thomas] I do not think we’re close. That is a separate area of research, and it often involves a lot of philosophical questions, such as what human consciousness is or what it means to be human. That’s less my area.

What appeals to me are these really practical applications where the goal is not to emulate a human brain, but to do a specific task well: for example, identifying whether something is cancerous or recommending the best course of treatment. Those applications aren’t about replicating a human brain so much as about getting healthier patients and better outcomes.

[OnlineEducation.com] Related to that, because AI and robots are increasing the automation of traditional industries, have you given any thought to how we’re going to combat the displacement of workers in the future?

[Rachel Thomas] This is something that really scares me. Inequality is already a huge issue and I think that AI and automation are going to accelerate this by concentrating wealth into the hands of the few.

I am in favor of policies like a negative income tax, which would essentially pay people extra money to work if they’re not earning a livable wage. That would incentivize people to continue working as wages drop. It would require redistributive action by the government, along the lines of a basic income.
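As a back-of-the-envelope illustration of how a negative income tax works (the threshold and rate below are hypothetical figures, not numbers from the interview):

```python
def negative_income_tax(income, threshold=30000, rate=0.5):
    """Pay workers a fraction of the gap between their earnings and a
    threshold. All figures are hypothetical, for illustration only."""
    if income >= threshold:
        return 0.0
    return rate * (threshold - income)

# A worker earning 20,000 would receive 0.5 * (30,000 - 20,000) = 5,000,
# for a total of 25,000. Because the subsidy covers only part of the gap,
# take-home income still rises with earnings, preserving the incentive
# to keep working as wages drop.
print(negative_income_tax(20000))  # 5000.0
```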

[OnlineEducation.com] I want to move to talk about the gender disparity in technology and AI. I know you’ve given some thought to this in your excellent Medium posts. What are the demographics of your industry and are you typically one of the only women in the room?

[Rachel Thomas] That’s a great question. The field is overwhelmingly male, and it’s lacking in racial diversity as well. For tech overall, the attrition rate is really high: over twice as many women drop out of tech as men do. These are women who are qualified and work in the field, and then in many cases leave because of the hostile atmosphere.

There are also a lot of studies on how much harder it is for women to progress in their careers—to get promoted and recognized for what they’re doing. You see that the shortage of women and people of color is even more extreme when you look at the higher echelon in a company.

With fast.ai, we have a number of diversity scholarships for our in-person course. That’s really helped us curate a much more diverse class with women and people of color.

[OnlineEducation.com] Why do you think that women and other groups are still underrepresented?

[Rachel Thomas] I think that there is a lot of unconscious bias. This often compounds problems, making the environment less pleasant to work in and making it harder for women and people of color to advance. It comes in at all levels. I think it’s really pernicious because people working in tech often pride themselves on being super logical and rational. That can make it even harder to recognize our biases.

There are also studies showing how differently an idea is received from a man versus from a woman. A man and a woman pitch the exact same idea to different groups of study participants, and the idea is perceived as more logical and more persuasive when it comes from a man than when it comes from a woman.

In tech, there’s also a bit of a crisis in manager training and company processes. A lot of tech startups are very chaotic and I think there’s been this backlash against MBAs, so you have a lot of people structuring companies who haven’t given much thought to leadership or management, which are very specific skill sets.

[OnlineEducation.com] Yes! Especially when there’s this “move fast and break things” mentality. Thanks for bringing that up. That’s an important point. What do you think are the risks of having only one demographic shaping the future of AI?

[Rachel Thomas] There are a lot of risks. One, there’s the bias that can get baked into these algorithms. ProPublica has done a great job of reporting on this issue. There was software used to predict whether a prisoner would reoffend, and it was found to be very racially biased.

We also see it in products. In 2015, Google Photos labeled some black people as gorillas. Another tool, Google’s Word2Vec, came up with sexist analogies like “man is to computer programmer as woman is to homemaker.” It shows up in Google Translate, too: taking languages that have a gender-neutral singular pronoun and translating sentences like “He is a doctor” and “She is a nurse.” Basically, they’re adding gender based on these stereotypes.

Products themselves can be biased, and what products actually get created is another problem. Most people are making products for problems they’ve experienced and if that’s just a narrow slice of the population, you won’t create solutions for different types of groups.

Going back to this concentration of wealth, I think a lot of people creating AI companies are going to get incredibly wealthy and that even further concentrates wealth across a single demographic.

[OnlineEducation.com] Those are all formidable challenges. One more question: given the disparities in tech, what advice do you have for women and other underrepresented groups who want to get into deep learning and AI?

[Rachel Thomas] The first step is to learn to code if you don’t already know how. If you’re interested in deep learning, I definitely recommend learning Python, which is the main language being used. Our course “Practical Deep Learning for Coders” is all online for free with no ads. There are other resources out there as well.

Then, a practical approach is to get in there right away and start working on problems you’re interested in! It’s really important to practice implementing what you’re learning about.