For anyone who enjoys this interview with Barbara, I recommend the film Hidden Figures, which also tells the story of great women who revolutionized the scientific landscape of an entire era but who, unfortunately, never received the recognition they deserved.
* * *
Barbara Liskov pioneered the modern approach to writing code. She warns that the challenges facing computer science today can’t be overcome with good design alone.
Barbara Liskov invented the architecture that underlies modern programs. “Designing something just powerful enough is an art.”
Good code has both substance and style. It provides all necessary information, without extraneous details. It bypasses inefficiencies and bugs. It is accurate, succinct and eloquent enough to be read and understood by humans.
But by the late 1960s, advances in computing power had outpaced the abilities of programmers. Many computer scientists created programs without thought for design. They wrote long, incoherent algorithms riddled with “goto” statements — instructions for the machine to leap to a new part of the program if a certain condition is satisfied. Early coders relied on these statements to fix unforeseen consequences of their code, but they made programs hard to read, unpredictable and even dangerous. Bad software eventually claimed lives, as when the Therac-25 computer-controlled radiation machine delivered massive overdoses of radiation to cancer patients.
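To make the contrast concrete (this sketch is mine, not from the article), here is a minimal C example of the goto-driven style described above next to its structured equivalent; the input-reading task and all names are invented for illustration.

```c
#include <stdio.h>

/* Hypothetical example: the tangled, goto-driven style described
 * above. Control jumps between labels, so a reader must trace the
 * jumps to follow the logic. */
int read_value_goto(void) {
    int value, attempts = 0, c;
retry:
    if (scanf("%d", &value) != 1) {
        attempts++;
        while ((c = getchar()) != '\n' && c != EOF)
            ;                        /* discard the bad input */
        if (attempts < 3)
            goto retry;
        goto fail;
    }
    return value;
fail:
    return -1;
}

/* The structured equivalent: the same behavior as a loop that can
 * be read straight through, top to bottom. */
int read_value_structured(void) {
    int value, c;
    for (int attempts = 0; attempts < 3; attempts++) {
        if (scanf("%d", &value) == 1)
            return value;
        while ((c = getchar()) != '\n' && c != EOF)
            ;                        /* discard the bad input */
    }
    return -1;
}

int main(void) {
    printf("%d\n", read_value_structured());
    return 0;
}
```

In a two-function toy the difference looks small; in the long programs the passage describes, every extra label multiplies the paths a reader must hold in mind at once.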
By the time Barbara Liskov earned her doctorate in computer science from Stanford University in 1968, she envied electrical engineers because they worked with hardware connected by wires. That architecture naturally allowed them to break up problems and divide them into modules, an approach that gave them more control since it permitted them to reason independently about discrete components.
As a computer scientist thinking about code, Liskov had no physical objects to work with. Like a novelist or a poet, she was staring at a blank page.
Liskov, who had studied mathematics as an undergraduate at the University of California, Berkeley, wanted to approach programming not as a technical problem, but as a mathematical problem — something that could be informed and guided by logical principles and aesthetic beauty. She wanted to organize software so that she could exercise control over it, while also making sense of its complexity.
When she was still a young professor at the Massachusetts Institute of Technology, she led the team that created the first programming language that did not rely on goto statements. The language, CLU (short for “cluster”), relied on an approach she invented — data abstraction — that organized code into modules. Every important programming language used today, including Java, C++ and C#, is a descendant of CLU.
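As a rough illustration of what data abstraction buys (my own sketch, written in C rather than CLU, with a hypothetical IntSet type): clients see only a type name and its operations, while the representation stays hidden, which is the guarantee CLU's clusters enforced at the language level.

```c
/* intset.h -- a hypothetical interface illustrating data abstraction.
 * Clients see only the type name and its operations; the representation
 * is hidden, much as CLU's clusters hid it by design. */
typedef struct IntSet IntSet;

IntSet *intset_create(void);
void    intset_destroy(IntSet *s);
void    intset_insert(IntSet *s, int value);
int     intset_contains(const IntSet *s, int value);

/* intset.c -- one possible implementation, free to change at any time
 * without touching client code. */
#include <stdlib.h>

#define INTSET_CAP 256

struct IntSet {
    int items[INTSET_CAP];
    int count;
};

IntSet *intset_create(void) {
    IntSet *s = malloc(sizeof *s);
    if (s != NULL)
        s->count = 0;
    return s;
}

void intset_destroy(IntSet *s) {
    free(s);
}

int intset_contains(const IntSet *s, int value) {
    for (int i = 0; i < s->count; i++)
        if (s->items[i] == value)
            return 1;
    return 0;
}

void intset_insert(IntSet *s, int value) {
    if (!intset_contains(s, value) && s->count < INTSET_CAP)
        s->items[s->count++] = value;
}
```

Because callers can never reach inside the struct, the fixed array could later be swapped for a hash table without changing any client code; that separation of interface from implementation is the modularity described above.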
“One advantage to being in the field so early was that great problems were sitting there. All you had to do was jump on them,” said Liskov. In 2008, Liskov won the Turing Award — often called the Nobel Prize of computing — for “contributions to practical and theoretical foundations of programming language and system design, especially related to data abstraction, fault tolerance, and distributed computing.”
Quanta Magazine caught up with Liskov at her home following the Heidelberg Laureate Forum — an intimate, invitation-only gathering of computer scientists and mathematicians who have earned the most prestigious awards in their fields. Liskov had been invited to Heidelberg but needed to cancel a few weeks before the forum for personal reasons. The interview has been condensed and edited for clarity.
You came of age professionally during the development of artificial intelligence. How has thinking about AI and machine learning changed during your career?
I did my Ph.D. with John McCarthy in AI. I wrote a program to play chess endgames. John suggested this topic because I didn’t play chess. I read the [chess] textbooks and translated those algorithms into computer science. In those days, the received wisdom was to get the program to act the way a person would. That’s not how it is now.
Today, machine learning programs do a pretty good job most of the time, but they don’t always work. People don’t understand why they work or don’t work. If I’m working on a problem and need to understand exactly why an algorithm works, I’m not going to apply machine learning. On the other hand, one of my colleagues is analyzing mammograms with machine learning and finding evidence that cancer can be detected much earlier.
AI is an application rather than a core discipline. It’s always been used to do something.
Were you more interested in it as a core discipline?
Honestly, AI couldn’t do much in those days. I was interested in the underlying work. “How do you organize software?” was a really interesting problem. In a design process, you’re faced with figuring out how to implement an application. You need to organize the code by breaking it into pieces. Data abstraction helps with this. It’s a lot like proving a theorem. You can’t prove a theorem in one fell swoop. Instead, you invent some lemmas and you decompose the problem.
In my version of computational thinking, I imagine an abstract machine with just the data types and operations that I want. If this machine existed, then I could write the program I want. But it doesn’t. Instead I have introduced a bunch of subproblems — the data types and operations — and I need to figure out how to implement them. I do this over and over until I’m working with a real machine or a real programming language. That’s the art of design.
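A minimal sketch of that process in C (all names here are invented for illustration): the operations of the imagined machine are declared first, the program is written against them, and only then is each operation implemented as its own subproblem.

```c
#include <stdbool.h>
#include <stdio.h>

/* Step 1: declare the abstract machine I wish I had -- just the
 * operations the program needs, before worrying about how they work. */
bool queue_empty(void);
int  queue_pop(void);
void queue_push(int task);
void process(int task);

/* Step 2: with that machine assumed, the top-level program is short
 * and easy to reason about. */
void run(void) {
    while (!queue_empty())
        process(queue_pop());
}

/* Step 3: each imagined operation becomes its own subproblem, solved
 * here against the real machine (a fixed array stands in for a queue). */
static int tasks[16];
static int head = 0, tail = 0;

bool queue_empty(void) { return head == tail; }
void queue_push(int t) { if (tail < 16) tasks[tail++] = t; }
int  queue_pop(void)   { return tasks[head++]; }
void process(int task) { printf("handling task %d\n", task); }

int main(void) {
    queue_push(1);
    queue_push(2);
    run();
    return 0;
}
```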
Early computers used punch cards to input both programs and data.
Knowing methodology doesn’t mean you’re good at designing. Some people can design, and some people can’t. I never felt that I could teach my students how to design. I could show them design, explain design, talk about data abstraction, and tell them what’s good and bad. With too many bells and whistles, it gets complicated. With too few, there are inefficiencies. Designing something just powerful enough is an art.
If you had a magic wand and could guide the development of computer science moving forward, what would that look like?
I’m very worried about the internet. We have a huge set of problems, including fake news and security issues. I’m worried about the divorced couple in which the husband publishes slander about the wife, including information about where she lives. There is terrible stuff going on. Part of this grew out of an attitude in the ’80s. In those days, we were 15 universities and a couple of government labs connected by the internet. We were all buddies. The attitude was that sites shouldn’t have responsibility for content. It would stifle their development. You see that attitude has continued.
Was this thinking an extension of academic freedom?
No, it was pragmatism, without any understanding of where we’d end up. If they took on policing, they would have had to think about sticky issues. They went into it without adding safeguards. More than technology is needed to solve our current problems. We need laws addressing the ways people misbehave. We need to work out this question of privacy versus security. Some of it’s technical. For example, Facebook has an algorithm for how it spreads information. They could spread information more slowly or recognize what information shouldn’t be moving. Societies always have trouble dealing with something new. We can hope we mature. But if I had a magic wand, I’d make all that go away.
Talk to me about your personal journey as a woman in computer science.
I was encouraged to do well in school. I don’t know that my mother overtly encouraged me, but she didn’t get in my face and say, “Oh no, this is a bad thing to do.” I took all my math and science courses, which girls were not encouraged to do. At Berkeley, I was one of one or two women in classes of 100. No one ever said, “Gee, you’re doing well, why don’t you work with me?” I didn’t know such things went on. I went to graduate school at Stanford. When I graduated, nobody talked to me about jobs. I did notice that male colleagues, like Raj Reddy, who was a friend of mine, were recruited for academic positions. Nobody recruited me.
Back then, advisers placed graduates through deals with departments around the country.
Yes, but nobody made deals for me. In the ’90s, I went back to Stanford for a department celebration. A panel of the old professors, without knowing what they were doing, described the old boy network. They said, “Oh, my friend over there told me that I’ve got this nice young guy you should hire.” It was just how it was. They were clueless. They talked about a young woman who did so well because she married a professor! Clueless. Another colleague had a pinup in his office. I asked him, “What’s that pinup in your office?” Clueless.
I had applied to MIT, but they would not consider me for a faculty position. When that happens, you think: “I’m not good enough.” You can’t help it. But I also thought, “Computer science is wide open.” My industry job at Mitre was a good research job. There, I worked on programming methodology and produced research that got me my first prize paper. Then in 1971 I gave a talk, after which Corby [Fernando Corbató] invited me to apply to MIT. I was also invited to apply to Berkeley. Things were changing.
Even so, is it correct that there were approximately 1,000 faculty members when you started at MIT, only 10 of whom were women?
That was my recollection.
So there was progress, but …
Title IX wasn’t a law yet, but pressure was building. MIT President Jerry Wiesner was pushing. Pressure must come from the top. It doesn’t bubble up from the bottom. There were a number of distinguished women at MIT who weren’t on the faculty. Around then, several of them were invited to join the faculty all of a sudden. Of course, math never had any. Math is really bad.
My sense is that all scientific fields have failed to recognize some foundational contributions by women.
In the 10 years before I was head of computer science at MIT, the department identified only one woman worth hiring. When I was the head [from 2001 to 2004], I hired seven women. We didn’t scrape the bottom of the barrel. All three junior women I hired are outstanding. There was a long period of time where women were not considered at all.
After you won the Turing Award, a comment appeared online saying, “Why did she get this award? She didn’t do anything that we didn’t already know.” This dismissive comment may or may not have had to do with the fact that you are a woman.
Oh, I bet it did! There was another comment — one I never talk about — that said, “Oh, she didn’t do that work. [A male colleague] did it instead.” That was total nonsense. I wasn’t looking at the comments. My husband was. These were a couple he resurrected. I sometimes gave talks where I got hostile questions, but you have to be prepared for that, whether because I was a woman or because people are trying to show up, you know …
Show up a Turing Award winner?
Yes! I didn’t realize then that some people in my department had my back. And by the time I was traveling around, I was already really well known. But that’s the mystery: Why are some women able to persevere?
Do you have insight for emerging women scientists? Is there some sort of Teflon that women may apply to prevent discrimination or harassment from sticking?
It’d be nice to know how to apply that Teflon. It wasn’t until I had been at MIT for a while that I lost my inhibitions to ask questions in public. It took a long time to develop that self-confidence.
It’s delicate. Your story reveals an undercurrent of “lie low until you can really stand tall, and then embrace that.”
Yes, maybe that was my strategy. That, together with a lack of a need to please people. Women are socialized to please.
That’s concrete advice: Let go of the need to please.
You know, things are not really better now than they were then. Maybe I was lucky. If I had gotten married right out of college, I probably would have ended up in a totally different place.
Do you really think that? Your contributions have transformed computing and society.
You know, you follow this crooked path, and who knows?