Localization Academy

How To Make Community Localization Work – Jeff Beatty From Mozilla

Community localization means free translations… right? Not so fast. Before you make a decision, listen to Jeff Beatty – Senior Head of Localization at Mozilla.

Mozilla products are localized into 200+ languages thanks to their global community of localizers. How do you make community localization work? Learn in this episode where we also discussed:

  • How to manage quality with a crowd
  • Incentives for the community
  • Why community localization can get more expensive
  • Gamification and why Jeff hates leaderboards
  • How to make partial localization viable
  • Democratization of MT
  • Standing in a corner… laughing?!

This is episode #29 of my social interaction practice, also known as The Localization Podcast 🙂 #localization​ and #translation​ insight delivered to you by the power of voice, this time with Jeff Beatty.


Andrej Zito 

All right, Jeff, welcome to the podcast. It’s a pleasure to have you here.

Jeff Beatty 

Thanks. Thanks for having me.

Andrej Zito 

How are you doing, man?

Jeff Beatty 

As well as can be.

Andrej Zito 

Yeah, you just showed me that something happened with your leg, right? Last week?

Jeff Beatty 

Yep. I was longboarding and got in an accident. Now I get to use a crutch.

Andrej Zito 

Yeah, you get to stay home and work from home. Right, which you couldn’t do before.

Jeff Beatty 

I’ve been working from home for the last nine years.

Andrej Zito 

Yeah, you didn’t even need COVID right for that.

Jeff Beatty 

Right? It does make my commute harder, because I have to go downstairs and upstairs. Doing that with a crutch, for me, is much harder now than it was pre-COVID.

Andrej Zito 

Yeah. You mentioned that you’ve been working from home for the last nine years. And if my information is correct, you’ve also been working with Mozilla for the last nine years. Is that a coincidence?

Jeff Beatty 

Ah no, that’s my job. Previous to working for Mozilla, I reported into an office every day. They had a strict no-working-from-home policy. My laptop stayed at the office; at the end of the day, I wasn’t allowed to take anything with me. And then when I went to Mozilla, it was my first experience with a full-time remote position. I worked on a team that was completely remote and distributed all throughout the world.

Andrej Zito 

Does Mozilla have any HQ somewhere?

Jeff Beatty 

Yes, in San Francisco and in Mountain View.

Andrej Zito 

San Francisco. Right. Okay, so just for the folks, you are the senior head of localization at Mozilla. Is that correct?

Jeff Beatty 

That is correct.

Andrej Zito 

So I’m wondering, how did you start? How did you get into localization?

Jeff Beatty 

I got into localization because, as I was telling you a little earlier, 15 years ago I started a mission for the Church of Jesus Christ of Latter-day Saints, more commonly known as the Mormons. It was a two-year, full-time evangelism mission, where I was tasked to go out to Spanish-speaking populations and effectively preach Christian principles. And in the process of that, I came to really believe that linguistic accessibility and digital access were critical. I saw people try to use phones or try to use computers. And they weren’t using them, they weren’t being effective, they often just tried to avoid it, because they didn’t understand it. It wasn’t speaking to them in their language. On top of that, I was one of the few missionaries within my cohort that really understood that if I could speak Spanish as close to natively as possible, I would get more positive reactions and responses from the people I spoke to. Many missionaries would study the language for maybe three months, then say, yeah, I can get by, and move on. I studied it every day for the full two years of my mission, because I truly believed people deserve to hear information in their language. So then, when I came home, I went back to university, found out that there was a Spanish translation major, and tested into that. As part of the translation undergraduate coursework, I learned about localization and was absolutely fascinated by the idea. So I started a minor in computational linguistics that included some courses that kind of alluded to localization. All that time, I was a freelance translator, translating UI content for a couple of different clients. Then I got hired as a technical writer at the company I was at previous to Mozilla, where I was not only a technical writer but also the localization manager.
So I learned a lot about localization just in that role, on the job, and spent much more of my time being a localization manager than I did being a technical writer. That was a nice catalyst: I already had the translation background, I’d had the professional experience of being a translator, and being a localization manager was really cool. And that’s really how I got into it. It was as planned as these things can be, and when I had opportunities, I took advantage of them. I came over to Mozilla first as a localization project manager. Then I went to the University of Limerick and did the graduate master’s course in multilingual computing and localization. I was actually part of the first cohort that did that remotely, as a distance program, though we did have to go and spend some time at the end of that program in Limerick, Ireland. But for the most part, it was distance learning. I graduated from that program and have continued to work in localization since.

Andrej Zito 

What did the education give you at that point? Since you were already working full time, and you were in the field doing actual projects, what did you learn?

Jeff Beatty 

There was a lot about project management. Because I was lacking the formal background, I didn’t know how to talk about project management; I didn’t really know the formal processes. What it was, is that I was just a highly organized person. That skill set came into play, and nobody cared whether or not I truly knew what project management was; they just knew that I was very detail-oriented and highly organized. So they could give me assignments from the beginning and trust that I was going to go about gathering requirements, validating assumptions, and all of that. I didn’t learn until this program what all of that fancy terminology I just used to describe some project management pieces was, or things like risk assessment and risk analysis. I didn’t know any of that. I had done it before; I had no idea that that’s what I was doing. So that’s part of what the program served. There were a few other things where I thought, yeah, I’ve been doing this forever, I just didn’t know that’s what it was called. Or, I didn’t know that it fit into the software development lifecycle in this particular way, even though I’d seen it. My only experience was at this company where they didn’t exactly do localization the right way, and I’m talking about the one before Mozilla. So sitting there and learning it, I was able to look back and realize, oh wow, I was doing things completely wrong. Even though I was doing things that are localization-oriented, this should have happened earlier in the process, that should have happened later. The master’s program gave me that more holistic view. On top of that, it opened up opportunities for me that were previously closed. I currently teach as an adjunct professor at Brigham Young University, and I teach computer-assisted translation tools. If I didn’t have a master’s degree, I wouldn’t have been qualified to teach that course.
I also think that if I didn’t have a master’s degree, I wouldn’t have qualified myself to become head of localization. So it served a few different purposes. All very helpful.

Andrej Zito 

When we had our first chat, we agreed together that we would center this interview around community localization, a couple of the tools that Mozilla has developed and is in charge of to help with community localization, and the open-source-ness of localization. But I’m wondering, when you first joined Mozilla, what did localization look like when you started as a PM?

Jeff Beatty 

When I started at Mozilla, it looked very different from what it looks like today. I’m not sure how familiar you are with open source culture, but it’s very hacky. I’m sure you’ve heard the term bootstraps, like pulling yourself up by your bootstraps. It’s very bootstrappy in terms of culture, and so Mozilla had a localization program that reflected the hacky, bootstrappy factors that are predominantly found within open source culture. There was no centralization of localization. There were various tools that different communities were using to manage localization. You had to be highly technical, because you needed to know how to utilize version control systems as a translator, and not just the common ones like Subversion or Git; you needed to understand Mercurial, which is a little more niche in terms of version control systems. So we would receive contributions directly through people interacting with version control and submitting translations: they would clone an English repository, they would run a script that would tell them where there was new content to translate, and then they would manually go in, copy that new string, paste it into their locale, translate the string there, and commit it back to the repository. It was an extremely painful process, and only people that really adhered to that hacker mentality of, I’m going to do hard things the hard way because that’s cool, would be successful. On top of that, we had three or four different tools out there that were doing different things for different communities. So it was very decentralized, and that made it very difficult to validate contributions, and difficult to ensure quality contributions. Between then and now, we’ve really streamlined and centralized a lot. It’s a very different landscape now than what it was when I started.
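The “run a script to find new content” step of that old workflow can be sketched roughly like this. This is an illustrative approximation only, not Mozilla’s actual tooling; the file contents and function names are made up, using a simplified `.properties`-style format:

```python
# Illustrative sketch (not Mozilla's actual scripts): find which English
# strings a locale has not translated yet, by comparing the keys of an
# en-US .properties file against the locale's copy.

def parse_properties(text):
    """Parse simple key=value lines, skipping comments and blank lines."""
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        entries[key.strip()] = value.strip()
    return entries

def untranslated_keys(english_text, locale_text):
    """Return keys present in the English file but missing from the locale."""
    english = parse_properties(english_text)
    locale = parse_properties(locale_text)
    return sorted(set(english) - set(locale))

en = """
# browser.properties (en-US)
tab.new=New Tab
tab.close=Close Tab
bookmark.add=Add Bookmark
"""

sk = """
# browser.properties (sk)
tab.new=Nová karta
tab.close=Zavrieť kartu
"""

missing = untranslated_keys(en, sk)
print(missing)  # → ['bookmark.add']
```

In the workflow Jeff describes, a localizer would then copy each missing string into their locale file by hand, translate it, and commit it back with Mercurial.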

Andrej Zito 

Is that what you immediately felt when you first joined Mozilla, that things could be running much better and much smoother? Or was there a team effort behind the update of your localization program and how things are run at Mozilla?

Jeff Beatty 

I’d like to think that all of us had an inclination that things could be better. What really catalyzed change was when Mozilla began expanding beyond the Firefox desktop browser, or the Thunderbird email client, or the SeaMonkey web browser; when we really began to look at opportunities on different platforms. That’s when we began to see the pain points more clearly in the process that we had developed, or that had been developed, for Firefox, and realized that things needed to change if we were going to be able to continue to meet the needs of the engineering organization. I would like to think that I was one of the main proponents behind those changes, advocating for them. In terms of formal background, I had the most experience and knowledge about localization specifically of the individuals on my team. They all knew the Mozilla way of localizing very well, but their experience with localization was often restricted to either contributing to open source projects or working for Mozilla. Beyond that, it was kind of a black box, and they worked to reinvent the wheel in many ways. So I think the combination of this increase in demand to localize across multiple platforms, combined with my prior experience and knowledge in localization, and how I had done that at different companies in the past, helped to spur on some change. And then when I became responsible for all of localization, that’s, I think, really when change picked up. Because there was a lot of need for change at the time, and I was definitely the proponent, the catalyst, behind that change.

Andrej Zito 

Mm hmm. So when you eventually became the head of localization, what was the first thing that you decided to change?

Jeff Beatty 

Do you mean, how did I make people the most angry? The first thing we changed was that we created a plan to centralize the funnel for contributions. We decided we were not going to accept contributions that did not come through either this tool or that tool. And that’s just it. A lot of members of our community were not thrilled with that idea. It disrupted their workflow; they were used to the way things were done. Some left as a result. Others pushed through, persevered, and realized: this actually is probably a better scenario for me. They saw improvements because more people could be involved, we lowered the technical barrier to entry, and they didn’t have to worry about more technical tasks that they had been responsible for in the past, thanks to automation. So we won over a lot of people. We also lost some people in the process, just because of change; that’s kind of the nature of change. People look at it and say, oh, this is scary, I hate this. Or they say, okay, I can figure this out and move forward. So that was probably the biggest change: saying no more contributions directly to version control, and if you are going to localize for us, it’s either through this tool or through that tool, none others, and we are not going to set up others. And gradually that consolidated down to one central hub for all community localization.

Andrej Zito 

Mm hmm. And that is Pontoon.

Jeff Beatty 

Yep, that’s correct.

Andrej Zito 

Before we get into the tools, you already mentioned the people: some of them left, some of them came when you made it more accessible. And one of the things that I’ve noticed in your profile is that you are a localization community builder. I think this is key for Mozilla, right, if you’re localizing your products mainly through the power of the community. So I’m wondering, at that point you decide, okay, we need to change things, I’m probably going to piss off some people, but I need more people. How do you create a community? How do you find them? Is there something like a recruitment process, or can everybody volunteer? How do you go about building a community?

Jeff Beatty 

So we’ve taken several approaches to that over the years, because you start with one approach, you see where it succeeds, you see where it fails, and then you iterate and move on. There was a time when we were actively going out to communities and asking them to participate and be involved. The problem we identified in doing that was that we were asking individuals and communities to care about something that they didn’t previously care about. And that can be very hard, especially when you’re trying to motivate them to translate a lot of content.

Andrej Zito 

So when you say they didn’t care, are you saying that they were not behind the open source philosophy, or that they didn’t use Mozilla’s products?

Jeff Beatty 

Some of them didn’t use Mozilla’s products, some of them didn’t understand or know what open source was. Some of them also just didn’t think that localization for their language was important. I recall having conversations with several Native American communities about being involved in open source and participating, and how that’s a great way to create immersive experiences for their youth, you know, creating the need for them to use their language by providing them with software and apps that speak to them in that language, and how it would be a critical tool for revitalization. But they were very much focused on education, and so they didn’t have time for, or didn’t see the value in, localization. Because for them, it was all about: no, we need to educate the youth, educate, educate, educate, and that’s where all of their resources went. So we took that approach, where we really tried to approach communities about being involved. We’ve since taken a more passive approach, where we have opened the doors and trust that people will come to us when they are ready. When they have discovered us, they discover what it is that we’re trying to do, what role they can have in that, and how they can represent their language. Then we bring them in, because they’ve already bought into the mission at that point. They’ve bought into what we’re trying to do. They like the products, they love their community, they love their culture and their language, and they want to do something. At that point, it’s really a question of taking that good intention and that energy and focusing it on productive and sustainable methods of contribution for the localization community. So we’ve gone from this more proactive, we’re-going-to-do-a-lot-of-outreach approach, to a more passive one: when you come to us, we have all the resources in the world to help mentor you to be successful with your goals, which happen to align with our goals.

Andrej Zito 

I don’t know how to explain this better, so bear with me. If you are a startup, you have this idea of your ideal customer, and that’s the person whose door you should be knocking on. Right? So what does the ideal person look like for you, the one that comes to you and says, hey, I want to help? Where are they usually in their life? What needs to happen to them? Are they translators? Are they experienced users of Mozilla products? Are they, like, fanatics of the open source idea?

Jeff Beatty 

So, I think over time it has become more rare that we have found people, or that people have come to us, because they’re fanatics of open source. The communities that come to us recently see opportunity for their language. So they are already either users of Firefox who don’t see their language there and want it represented, or they are participants in other initiatives that Mozilla has going on. A great example is the Common Voice initiative, where we created a platform for people to contribute their voice to voice data sets that are open source and openly licensed, so other companies and people can take those data sets, do academic research over them, or utilize them to train machine learning algorithms for different products. And people have gravitated to that project because they want to see their language represented in there. So they become familiar with Mozilla through some avenue: either using our products, engaging in another initiative, or having a friend that is already localizing products, or that is a user who is constantly stealing their phone away from them, installing Firefox and setting it as their default browser. And then they ask, hey, why did you do this? I say that speaking from experience. When I was young, I would do that to my dad’s computer, but it was with Netscape at the time. I would always download and install Netscape, and he’d come over and say, oh, this is just a virus, what did you do? So those are the kind of people that tend to gravitate toward Mozilla: people that care deeply about their language and see an avenue to make their language have representation in a digital ecosystem. They’re passionate about their language, they’re passionate about digital equity, and they find opportunities with us. That often means that they could be professional translators.
We’ve had primary school teachers in India who have classrooms where they’re teaching kids how to use software, but the kids can’t use the software because it’s not in their language. And so they’re taking initiative on behalf of their students to make that software available to them in the language they understand. I think the vast majority of people that are contributing are engineers. They’re already developers, or they’re involved in language technology to some extent. It’s been a while since we’ve done a profile survey; we should probably run that again and see where people are coming from.

Andrej Zito 

Once people join in, is there any way that you interact with them, outside of giving them the tools to do what they want to do? Do you somehow try to direct them in a certain way?

Jeff Beatty 

Absolutely. So Pontoon already comes equipped with some good methods to help them focus their limited time with us, because we acknowledge that they don’t have all the time in the world, and we also have not paid them for the privilege of their time. They are doing it out of the goodness of their heart. So we want to help focus them toward projects of highest impact, and there’s a star prioritization system in place for that. On top of that, we have developed a, I shouldn’t even call it a mentorship program, but more of a mentorship tradition within our team: helping individuals that want to lead communities and want to see this change happen, mentoring them on how to do that effectively and in a scalable way, so that whether or not they are involved in the future, their contributions remain useful and impactful to people out in the world. So we’ll do a lot of one-on-one mentorship. There are mailing lists where we have conversations about localization. We also have real-time chat rooms over Matrix and Telegram, where the community can reach out to us. On top of that, we create opportunities to meet with core and active localizers in person. We will actually organize workshops in different parts of the world, travel out to those parts of the world, and bring in people from nearby countries and regions, to offer trainings, to share a social experience, and to try to either achieve a common goal or really just build trust with those individuals.

Andrej Zito 

Mm hmm. Besides what we’ve already mentioned and talked about, what are the other incentives that people could get out of this? Can they see their name somewhere, like when people use the product in their localized version?

Jeff Beatty 

If you use Firefox, you can open a new tab and type about:credits. In there, you will see the names of all of the individuals who have made contributions to Firefox and have notified us of a desire to have their name listed there. We don’t do it automatically, because we recognize that there is a premium on privacy and anonymity. But if you want to see your name there, you very much can have your name there. We also produce a monthly l10n report, and as part of that we have a spotlight section, called Friends of the Lion, where different leaders in the various communities that are localizing with us will nominate people to be spotlighted and called out for a specific thing. So we use that platform as well to give some recognition. We also try, whenever possible, to send gear, like t-shirts and stickers, and make those available. These in-person meetups can also be a form of recognition and something that is motivating to people. I know many who have leveraged their experience localizing with us into professional work. Not just in the localization industry or the language industry, but also in software in general: they build up a portfolio, they’re able to point back to that portfolio, and that opens doors to them that were previously closed. That’s particularly useful for people in specific parts of the world who are more eager to accumulate that type of profile or portfolio than people in other parts of the world. I’m a firm advocate of the idea that if you are interested in breaking into the language industry, or even just into software, having a strong open source presence is critical. Because when you have that strong presence, you have not only a portfolio of work that you’ve done and contributed to, in projects large and small, but you also develop a network of people that can vouch for your good work.
So then you come with references to the work that you are able to do. And I know there are some people out there who think that volunteer work doesn’t translate into professional work. I can tell you, from firsthand experience witnessing this, that it absolutely does. So I’m a firm advocate of volunteering, and of having an open source profile, portfolio, and presence in order to gain work. And I’ve tried to lean more heavily into that aspect as a motivating factor and recognition factor for people to be involved.

Andrej Zito 

Mm hmm. Absolutely. I’m just curious, very quick question. Do you in any way utilize gamification as part of building the community and motivating people?

Jeff Beatty 

That’s a great question. And it is a big question in my book. The only gamification that we use is a leaderboard, and I hate it. Leaderboards are worthless, in my opinion. It is so easy to game a leaderboard system, and we’ve seen it so many times. Your leaderboard is only as useful as the metric that is used within that leaderboard, right? So if your leaderboard is only about volume of contribution, that’s a crappy leaderboard, and it doesn’t actually tell you anything. I could go into a project and just click Google Translate for every string, and my volume of contribution will skyrocket and I’ll become number one really quick. But that doesn’t speak to anything about me or about the quality of what I’ve contributed. So I look forward to being able to implement more detailed leaderboards and to utilize digital badges or other methods of gamification that are based on real metrics measuring an individual’s progress and quality of contribution. And that’s really where this whole volunteer-to-professional thing comes into play, and where I think that open source projects can really make contributions back to those individuals that contribute to them. If you are gamifying your system in such a way that the metric you use for leaderboards or for digital badges is reflective of progress, skill set acquisition, and quality, then those actually have value out in the marketplace, and you’re contributing value back to those individuals. I’ve called it kind of a clout exchange in the past: they are giving clout to you through their contributions, and you’re giving clout back to them by saying, your contributions are awesome, and we vouch for you. So I won’t go into any more detail. I have a complicated opinion about gamification. It’s only useful if you’re measuring the right things, and determining what the right things are, that’s a big gray area. And it’s hard to arrive at a consensus, I think.

Andrej Zito 

We’ve touched on the area of quality a little bit, which I think is something very important, especially if you allow anybody to contribute. So my question is, how do you manage quality? Or does something like quality management actually happen in community localization?

Jeff Beatty 

So we manage quality in very similar ways to what you would see anywhere else. We utilize translation memory, we utilize termbases, in order to have some method of ensuring consistency across translations. We also have community-generated style guides that are used to train new people that come. They’re also used as the ruler against which to measure the quality of a contribution. We also, within Pontoon, have effectively three different sets of rights for volunteers. We have contributor-level rights, which is basically the base level: anybody that creates an account in Pontoon is a contributor, and you can submit a translation suggestion that must go through peer review before it actually lands in our product repositories. Then you have translator access, which is the rights given to people to be able to translate freely and have what they translate end up in the repositories. It also gives them the rights to review pending translation suggestions from other people. And then finally, you have manager rights, which gives people the ability to manage the access for each user that is contributing to that particular language. It also gives them the ability to review pending translations and to submit their own translations. So there’s this autonomous vetting process that we’ve established in each community, at the discretion of the managers and the translators. And as a contributor learns the rules, according to the style guide that has been produced by that community, they can advance to the point where they can translate, and we can have confidence that the translations they give will be of good quality and in accordance with that rule set. All of this being said, I don’t want to give the impression that we’ve solved quality control for community environments.
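The three tiers Jeff describes can be pictured with a small sketch. This is assumed, simplified logic for illustration, not Pontoon’s actual code; only the role names (contributor, translator, manager) come from the interview:

```python
# Illustrative sketch (not Pontoon's code) of the three-tier rights model:
# contributors submit suggestions that wait for peer review, while
# translators and managers can approve translations directly.

ROLES = ("contributor", "translator", "manager")

def can_approve(role):
    """Translators and managers may review/approve pending suggestions."""
    return role in ("translator", "manager")

def submit(role, suggestion, pending, approved):
    """Route a submitted translation based on the submitter's rights."""
    if can_approve(role):
        approved.append(suggestion)   # lands directly in the repositories
    else:
        pending.append(suggestion)    # waits for peer review

pending, approved = [], []
submit("contributor", "Nová karta", pending, approved)
submit("translator", "Zavrieť kartu", pending, approved)
print(pending, approved)  # → ['Nová karta'] ['Zavrieť kartu']
```

The key design point is that the gate sits inside each language community: who holds translator or manager rights is decided by that community’s managers, not centrally.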
If you, as an organization, are going to embark on the journey of a community localization model, one of the very first concessions that you have to make is on acceptance criteria. That could mean that you don’t pay as close attention, or that you allow linguistic errors to slip through, because honestly, they just are going to. And if I’m being bluntly honest, professionals make linguistic errors too, of course, and they’re not always caught, and they get flak for it. So you kind of have to lower your acceptance criteria for what you will ship. That could be in measuring language quality, or it could be, I guess, in setting the bar at: don’t break my builds, and then I’ll ship you. For us, another factor has been localization completion. We’ve established a system where it doesn’t matter whether the localization is at 100% complete for version 58 of the release; if you have provided us with an update, no matter how small, to your localization for that version, then we’ll ship it. So we are shipping partial localizations all the time. Because we have to: we are at the discretion of a community of volunteers, and as such, we have no right to impose obligations on their time. Instead, we accept what they are graciously willing to give us, and we make it available to users as quickly and immediately as possible. So for our own internal quality metrics, we trust the community to sort through the language side of it. But our acceptance criteria is: don’t break our builds, and just give us something that is fit, and we’ll ship it.
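Shipping a partial localization usually amounts to a fallback merge: any string the locale has not translated yet ships in the en-US source language. This is an assumed, simplified sketch of that idea, not Mozilla’s actual build logic:

```python
# Illustrative sketch (assumed logic, not Mozilla's build system):
# ship a partial localization by falling back to the en-US source
# string for every key the locale is missing.

def build_strings(source, locale):
    """Merge a partial locale over the source strings, key by key."""
    return {key: locale.get(key, value) for key, value in source.items()}

en_us = {"tab.new": "New Tab", "tab.close": "Close Tab"}
sk = {"tab.new": "Nová karta"}  # partial: tab.close not yet translated

merged = build_strings(en_us, sk)
print(merged)
# → {'tab.new': 'Nová karta', 'tab.close': 'Close Tab'}
```

This is also why a user of a rarely localized build can see mixed-language screens, which is exactly the trade-off discussed next.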

Andrej Zito 

Were you always in favor of partial localization, or did you have a different model before? And does that mean that when I’m using some very rare language, I could see, in one screen of Firefox, one string in English and one string in my local language?

Jeff Beatty 

You’ve asked two big questions.

Andrej Zito 

Okay, let’s break it down.

Jeff Beatty 

Let’s break it down. What was the first part of the question?

Andrej Zito 

Whether you’ve always had this model, where you allow partial localization.

Jeff Beatty 

We have not. That was actually one of the bigger changes that I brought when I took over the team. At one point we were aggressive about it: if you don't bring this localization of Firefox up to 100% complete, we're going to drop your locale. And I realized just how terrible of a process that was, and how much of a, I would even say, abusive method of localizing through volunteers it was. So we ended up moving more toward this "we'll take what you can give us" approach. And then it was left to me and our program managers to advocate to product management inside of Mozilla that this is one of the trade-offs that we are accepting as part of a community-driven project. Where they might have come from Apple or from other places where you just don't ship something that is partially complete, we have to do a little re-educating and help them understand that with a different model comes a different set of acceptance criteria for shipping. And most of them have really come around to that way of thinking. That's been good, but it hasn't been without its trials and attempts to re-educate product managers. That's the first part of the question. What was the second part of your question?

Andrej Zito 

Yes, sorry. When you say most of them, does it mean there are still some product managers within Mozilla that only allow full localization? Otherwise they’re not shipping?

Jeff Beatty 

Yes. There is only one product that we have that has that policy. The reason being that there is technically no such thing as an English localization for that product. English is still the source, but they're not shipping an English version of the product. So to have something partially localized means that you're exposing the user to a language that not only did they not expect in that mixed content, but also that they cannot select from in the application settings. So there are very specific requirements there that inform that decision. But by and large, I think people have bought in. And on top of that, every time we get a new product manager or a new engineer who has come from outside of Mozilla, there's an opportunity for re-education that we have to take advantage of. So it's not that we're not convincing or persuading enough people that this is the way to do things. It's that there are always new people. So that re-education, that evangelism of this unique policy and process, is really an ongoing effort.

Andrej Zito 

So the second part of the question was: if you guys allow partial localization, does it mean that me as a user, when I select a rare language, the chances are very high that most of the strings will not be localized when you release a new feature? Does it mean that I'll see a mix of English and my own language?

Jeff Beatty 

That is a great question, and actually a good segue into technology. Yes, by virtue of the localization paradigm, that is what a user could come to expect. If you are using a partially localized product, you would see English, or the source language, mixed with the target language. Now, that being said, we have recognized that that's not an ideal paradigm to adhere to. And so we developed a technology that we've called Fluent that breaks that paradigm. Rather than treat localization as a binary setting, a single language in a user's profile, we treat it as a cascading fallback. Within Firefox, rather than going and choosing one language in your user preferences, you can actually select a prioritized list of languages that you understand. What Fluent enables us to do is, with each UI string, make an API call to see if there is an available string for the top preferred language. If so, it retrieves it and displays it to the user. If not, then instead of immediately falling back to English, it falls back to the next preference, and then the next, and then the next. So some builds, you might find, are actually a mixed localization of two or more languages, but they are all languages from the user's set list of language preferences. Take an under-resourced language in an area of the world where there's been colonization by a more widespread language: you might find that part of the localization is available in the under-resourced language, but anywhere it's not available, the more widespread language is in place. I hesitate to call Catalan an under-resourced language, but Spain is a really good example of where this is an interesting concept.

If we have a Catalan localization that is partially localized, it's very likely that users in Spain who are using that Catalan localization have Spanish as a fallback. So rather than serving them a mix of Catalan and English, we can serve them Catalan and Spanish, and they'll still know what is going on. We're still addressing the user's language needs in terms they understand.
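The cascading fallback Jeff describes can be sketched in a few lines of Python. This is a hypothetical illustration only, not Mozilla's actual Fluent API; the string IDs and translations are invented for the example:

```python
# Hypothetical sketch of Fluent-style cascading fallback: each UI
# string is resolved against the user's prioritized language list,
# not a single binary locale setting.

translations = {
    "ca": {"open-tab": "Obre una pestanya nova"},  # partial Catalan localization
    "es": {"open-tab": "Abrir una pestaña nueva",
           "close-tab": "Cerrar la pestaña"},
    "en": {"open-tab": "Open a new tab",
           "close-tab": "Close the tab"},
}

def resolve(string_id, preferred_locales):
    """Walk the user's prioritized locale list and return the first
    available translation; English is only the last resort."""
    for locale in preferred_locales:
        if string_id in translations.get(locale, {}):
            return translations[locale][string_id]
    return translations["en"][string_id]

# A Catalan user with Spanish as fallback sees Catalan where it
# exists and Spanish -- not English -- where it does not.
prefs = ["ca", "es", "en"]
print(resolve("open-tab", prefs))   # Obre una pestanya nova
print(resolve("close-tab", prefs))  # Cerrar la pestaña
```

The per-string lookup is the key design choice: because fallback happens string by string rather than per build, a partially localized product degrades to the user's second-choice language instead of jumping straight to English.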

Andrej Zito 

Before we really go into the tools, I still had one more question when it comes to quality. You mentioned the process that happens before translations get into the repository. I'm wondering if me as a user can also say, once I use your products, that I think there's something wrong with this string. So you get another level of feedback on the quality.

Jeff Beatty 

Yes. Our issue tracking system is also open, accessible through the browser. So you go in and you file a ticket that gets tracked by product and component, and those components usually map to locale codes. The community has access to those bugs, and when they see one come through, they're notified via email, and they can proactively address that feedback and make changes in Pontoon. What happens very often is that communities actually take that bug report and leverage it as an opportunity to recruit someone new: if you've cared enough to submit a bug report, well, that string is found right here in Pontoon, and you can go and correct it if you'd like to. And then we get a new volunteer.

Andrej Zito 

Right? Yeah, that's another way you can build the community. Nice. Okay, so let's get into the technology part of our interview. You already mentioned Fluent. You described it from the user perspective, which I really like. But when I did my research, it says on your website that Fluent is a family of localization specifications, implementations and good practices developed by Mozilla. So to me, it sounds like a bigger concept than just being able to select a cascading list of languages.

Jeff Beatty 

It is a much bigger concept, absolutely. It's a bigger concept that ultimately has the end user experience as the focus, but also focuses on creating good experiences for developers with localization. There are a couple of paradigms that are common in software localization. One is one-to-one language matching, meaning that for every English sentence there is one corresponding sentence in whatever the target language is. The other is that localized products are produced through a build system, at build time, right? You take string resources in Russian, and it's at build time that those resources get incorporated into the build and into the executable file. And that is unchanging until there is another build, so it's all static. Fluent aims to disrupt both of those paradigms. With the one-to-one matching, we actually keep the language resource files in isolation, so that changes to the Russian file don't impact the French file. And we've enabled a dynamic scripting syntax, so that when a message is formed in English, if there are variants to that message in Russian, caused by the need to express different plurals, dynamic plurals, or maybe grammatical case, or maybe gender, then you can create those variants and the conditions under which one variant will appear to the user over another. So one message, you know, "hello to three of my friends" or "hello to any of my friends," which in English is just one string, in Russian could be very complex: "hello to all of my N friends," where "friends" is expressed one way for 11 friends, another way for 20 friends, and so on. So there are a number of variants, and we're actually making localization a one-to-many match in strings instead of a one-to-one match.
And this is reflected in how many TMSes and CAT tools actually handle complex messages from ICU MessageFormat. For instance, what many of them do is, when you have an English string that requires a plural in a specific language, they'll blow that one string up and have repetitions of the same string in order to provide a corresponding translation for each number. So it's still a one-to-one match. What we're doing is allowing all of that variation to take place within the same message and have it dynamically update as variables change. That's only possible, though, if you have what's called a runtime localization system. Runtime meaning that localization takes place on the client side, not at build time. You can still build language resources in, but what Fluent does is enable you to make an API call on a per-string basis to pull content from a remote place. So that content is actually being dynamically updated without requiring a new build.
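The Russian plural example above can be made concrete with a small Python sketch. This is a hypothetical illustration of the one-to-many idea, not Fluent's actual syntax or API; the plural categories follow the standard CLDR rules for Russian cardinals:

```python
# Hypothetical sketch of a one-to-many message: one English string,
# several Russian variants selected at runtime by CLDR plural rules.

def ru_plural_category(n):
    """CLDR plural categories for Russian cardinal numbers."""
    if n % 10 == 1 and n % 100 != 11:
        return "one"   # 1, 21, 31, ... (but not 11)
    if n % 10 in (2, 3, 4) and n % 100 not in (12, 13, 14):
        return "few"   # 2-4, 22-24, ... (but not 12-14)
    return "many"      # 5-20, 25-30, ...

# The single English message "Hello to my N friends" needs three
# Russian variants, chosen dynamically as the variable changes.
ru_friends = {
    "one":  "{n} друг",
    "few":  "{n} друга",
    "many": "{n} друзей",
}

def hello_friends_ru(n):
    return "Привет, " + ru_friends[ru_plural_category(n)].format(n=n)

print(hello_friends_ru(3))   # Привет, 3 друга
print(hello_friends_ru(11))  # Привет, 11 друзей
print(hello_friends_ru(21))  # Привет, 21 друг
```

Keeping all three variants inside one logical message, rather than duplicating the English string three times, is exactly the one-to-many match Jeff contrasts with the traditional one-to-one approach.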

Andrej Zito 

I think I get that part. But does it mean that in order for me to use languages, I always need to be connected to the internet, so that I can download the content through the APIs?

Jeff Beatty 

It means that you get the latest content when you're connected to the internet.

Andrej Zito 

Is there some, let's say, vanilla version that's already integrated into the build? So when I download something, it downloads, let's say, some older version of the language, if you know what I mean.

Jeff Beatty 

Uh huh. As of right now, that is the approach that we take: we still compile string resource files into builds. But the more we utilize Fluent in our applications, the more dynamic updates we can send to users, the more we can fall back, and the more we can use additional variants of existing messages.

Andrej Zito 

And does Fluent exist mainly for your own developers, or is it something that other people could use when they write software?

Jeff Beatty 

We saw these paradigms as outdated paradigms that needed to be shaken up within the industry as a whole. So we developed Fluent with Firefox in mind first, informed by experiences that we've had localizing Firefox. But the intent is that this technology could be useful in all web apps and software, so that we can improve upon the end user experience and make sure that language needs are being addressed dynamically, and in the most natural way possible.

Andrej Zito 

I'm still a little bit confused. Is it like guidelines for how you should write your app better in existing programming languages? Or is it some new technology that sits on top of the existing programming languages?

Jeff Beatty 

It's new technology that sits on top, or that is embedded or ported into programming languages. So it's a set of libraries and syntax and specifications that enable runtime and complex messaging.

Andrej Zito 

Got it. Okay. What about Pontoon? I'm pretty sure that these two are very closely correlated. You already hinted at what Pontoon is. I would assume it's kind of like your own TMS solution that, yeah, has the community in mind first?

Jeff Beatty 

Yes, the primary user story for Pontoon is the community user story. Pontoon is our centralized translation hub for all community localization. We localize all of our products' user interfaces through there, as well as many of our websites. It is a translation management system. One thing I like to tell people, because anybody can go and see it at pontoon.mozilla.org, you can go in and make your first contribution, you can see how things are organized. A lot of people give us very positive feedback and say that it's really great, it's a highly collaborative experience, and they want to use it for their own projects or products. And what I often like to say is: Pontoon is a fantastic TMS for what Mozilla needs. It is not a great TMS for what anybody else needs. We have really optimized around the Mozilla set of needs and the Mozilla community's set of needs. The list of supported file formats is very small. It only really supports string-based localization, so anything that has UI strings. For anything in prose, there are no internal segmentation rules for it to take a paragraph and break it up into individual sentences. There's no rich editing. So yeah, if you look at Pontoon and compare it against other cloud-based TMSes, you're going to find that it falls short. But it is very much in line with the needs that we have, and we've optimized it around our use case. It also currently is the only tool out there that supports the complex messaging that is possible in Fluent.

Andrej Zito 

So, I'm thinking how to ask this; there are two questions that I wanted to ask. Okay, let's start with this. Let's say I'm a software company, let's say, I don't know, Google, maybe not Google, let's say Autodesk. I used to work at Autodesk. And I want to experiment with community localization. What tool should I look for? Is there actually anything like that on the market, like the commercial TMSes? Or could I somehow adopt Pontoon? Or would I need to develop my own solution?

Jeff Beatty 

There are plenty of options out on the market already. There are open source options like Pontoon, and Pontoon is deployable; we've had a number of different individuals and companies come to us and deploy their own instances of Pontoon. There's also Pootle, which is a solid community-based translation management system. But then you also have other platforms that were initially built around this collaborative model. You have Transifex, which does this; Lingotek did it as well. I believe Smartling also has community or crowdsourcing capabilities, and Crowdin is another option. So there are definitely options out on the market that people can look toward. But my first question to people that come to me and say "we want to implement a community model" is: why? Because a lot of the time, the assumptions people have about the value proposition of a community model are false. A great example is the first one that always comes to mind: "well, it's free, we're going to get cheap translation." You're not. Common Sense Advisory actually produced a couple of reports, I think it was Nataly Kelly that produced them, or maybe Rebecca Ray, where they looked at not only Mozilla but also Facebook and a few other companies that implemented crowdsourced or community localization, and found that in some cases it actually ended up being more expensive for the company than it saved; it was either cost-neutral or more expensive in most cases. And speaking firsthand, that is absolutely true: it is more expensive, or it can be. So I always try to find out why before I jump to "oh, you should use this tool or this technology." Because very often those assumptions need to be checked. And by the end of the conversation, very often people say, okay, maybe this isn't the thing that we wanted.

Andrej Zito 

Why is it more expensive? Is it because of the cost associated with getting the crowd together and keeping them engaged? Or is it because the quality drops and you lose business because of that?

Jeff Beatty 

Cost changes because it's no longer measured in a per-word rate, right, where you are paying for translation at a specific set rate with a vendor. Your costs are no longer going toward language services; your costs are going toward staff, or toward technology, or toward incentives for the community. So it's going toward different areas, and you end up having to have either your own in-house solution, or more engineers on staff to be able to customize Crowdin or whatever other solution you have, than you probably would have needed if you'd gone with an agency or a vendor. So it's more that your cost distribution changes. And because of that change, it can become more expensive, because you're now talking about human resources and the overhead that accompanies hiring more staff.

Andrej Zito 

Right. So how did you guys make it work?

Jeff Beatty 

We made it work because it happened organically. We started as an open source project. All code contributions were made by volunteers, and we happened to have volunteers making code contributions in other parts of the world. They would actually take CD-ROMs with Netscape or the Mozilla Suite on them, rip apart the executable, re-engineer it to have German translated strings, recompile it, and send it back to Mozilla headquarters and say, here you go, I've made this. So doing localization the community way is part of our DNA, largely because very, very early in the inception of Mozilla, it started that way organically. And we centered portions of our mission, about the internet that we want to see and the products that would reinforce it, around accessibility and the internet as a global public resource.

Andrej Zito 

I see. So your whole philosophy is built around that. So even if other departments within Mozilla don't know much about localization, by the way Mozilla functions and works, they technically support this community model, right? While if you have a typical traditional company, let's say Autodesk, and they just want to experiment with community localization, it would probably be more expensive in the end, right? Because they would need to change a lot of things.

Jeff Beatty 

They would need to change a lot of things, yeah. And there's also no guarantee that if you build it, people will come. So it could end up being an ultimate waste of money: trying to change a lot of things, hiring new staff, and then realizing that it's not going to work. A great example of that is LinkedIn, I believe, where LinkedIn attempted to crowdsource the localization of their site, only to receive a lot of backlash from users who were using the site in order to find professional work. So just because they built it didn't mean that it was a successful endeavor to implement that model.

Andrej Zito 

Right. I know when we had our first introduction call, you mentioned that the term crowdsourcing or crowd localization was hot like 10 years ago. Why do you think it died? Or is it only that a few companies can make it work?

Jeff Beatty 

So I think that it takes a unique setup and a unique investment in order to make it work. And what's happened over the last 10 years is that people have been very excited about the idea of crowd localization, but for the wrong reasons. And when they start really investigating how to create a successful community-led or crowdsourced translation program, they realize that it is high in cost, and that the assumptions they built their value proposition on were false anyway. What I think has actually happened over time is that learnings around setting up these programs have been distributed more widely. So companies aren't so quick to look at that and say, oh great, that's gonna save me money. Instead, they now have resources they can go to and read up on the successes and failures of some companies over others in implementing these models, and validate those assumptions themselves. So rather than going around using this buzzword, they actually have the resources now to invest in determining whether it's a viable option for them right from the beginning. And at that point, the reasons and the assumptions that impel a company toward that model have to be about the users, about the community, about a philosophy, because you have to make concessions in order to make that model successful. So I think that over time, people have realized that it's much more complex than "I'm going to save money." They've seen successes, they've seen failures, and they invest more time in really thoroughly checking their assumptions before they begin advocating that this is the value proposition they want to pursue. I don't think that it's dead. I think it's just that, like with machine translation, people have realized that there are models that are very suitable for different types of content and under different sets of requirements.
And they're equally as valid. I remember, not 10 years ago, 12 years ago, going to a conference and hearing people talk about how terrible translation memory was, how it was gonna take away people's jobs, and how there's just not enough room for translators with translation memory. And the same context was given for machine translation: machine translation is going to steal all of our jobs. And then it was also applied to crowdsourcing: crowdsourcing is going to steal all of our jobs. No. What actually ended up happening, and what we've seen happen, is that more content got translated, and the position of translator was elevated. It was now a premium to have a professional take a look and be involved in the translation process, whereas before it was a little more of a commodity. So we've been able to identify that there are suitable models for translation and localization that are not the traditional model, but they apply in small spheres, not at scale across any organization interested in implementing them.

Andrej Zito 

Yeah, speaking of MT, one thing that I wrote down here, I think also from your LinkedIn profile, it says: established a co-development partnership with Systran with the goal of democratizing language resources for machine translation. So before we go into how you guys utilize MT, I want to first learn what the democratization means. What did you mean by that?

Jeff Beatty 

What I mean is that this is another thing the language industry is starting to notice, and I think it's primarily driven by TAUS: corpora and data sets that are useful in machine translation are largely contained within individual silos. There are very few places you can turn to if you are a new company trying to build out a machine translation engine; there are very few data set brokers or data brokers in language, right, or in corpora. A lot of that content and those data sets are owned by Google, or owned by Apple, or owned by Microsoft, and they're not shared very widely. So we had an opportunity with Systran, Systran being built on the OpenNMT project from Harvard, to actually take our data sets that are already open sourced under the Mozilla Public License, incorporate them into an open neural machine translation training tool, and train open source models for neural machine translation, largely because we already have data sets that many other companies don't. And it's not just Spanish data sets; it's Mayan data sets, it's Sorbian data sets. And so in that way, by pursuing a co-development partnership with Systran, we have an opportunity to make use of our data sets in a new, modern context that actually opens up and expands the opportunities for any given language to play in a space it couldn't play in before, and thus gives companies and other organizations the opportunity to localize software into languages they couldn't consider before. So in that sense, we're democratizing it. We're taking our data sets, making them open, and creating tools to try to encourage companies to say: look, this is open, this is available to you, localize your stuff in Serbian, there are people that could use it, and now you have the resources to do that. Or do it in Welsh, or do it in Irish, or whatever other language we have data for.
Let's democratize this, and allow contributions of volunteers to have a greater impact on the entire ecosystem.

Andrej Zito 

So does it mean that if I now volunteer to help translate Firefox, my translations also help train this NMT with Systran?

Jeff Beatty 

They can, yes.

Andrej Zito 

What does it mean they can?

Jeff Beatty 

Well, as with all neural machine learning training processes, there is a data set threshold requirement.

Andrej Zito 

Like a minimum?

Jeff Beatty 

Yes, you need to have a certain amount of data in order for the machine learning algorithms to adequately train a neural model. And I say "they can" because until the point that they reach that threshold, the contribution is still impactful: we can ship products, we can make a language available on the web. But the neural model is kind of the top-tier threshold, right? So until you meet that minimum contribution requirement, we can't actually proceed to develop neural models for you. So the more you contribute, the greater your impact, effectively.

Andrej Zito 

Just an amateur question. Can you quantify how many strings or words need to be translated in order to start developing the NMT for a particular language?

Jeff Beatty 

I don't recall the exact number, but I know that it is in the tens of thousands of translation units, or aligned segments, in a corpus.

Andrej Zito 

And are there other companies contributing to the same NMT, or is it just you?

Jeff Beatty 

There are. So Systran announced a marketplace that serves as kind of a neural model broker for NMT using the Systran engine. We are one of a handful of companies that are considered early trainers, where we are training our models on our data sets and making those models available in that marketplace.

Andrej Zito 

Okay, speaking of MT, can you briefly describe what is Bergamot?

Jeff Beatty 

Oh, it will have to be very brief; I'm not fully involved in that project. Bergamot is a project that I think the University of Edinburgh is involved in, as well as the European Union, to create a client-side machine translation engine and attach it to the browser. The purpose is to help reinforce GDPR and privacy compliance, so that the data being used in these machine translation services can stay local and doesn't have to be communicated out to a server in an unknown location. So that's the project. Mozilla is a partner in developing it, because we're providing the browser and that ecosystem to help experiment with a local, client-side machine translation engine.

Andrej Zito 

Mm hmm. Can we be more specific about how it works normally? Does it mean that when I use Google Translate and I type in something, it is stored, and that becomes the data of Google, and they use it further?

Jeff Beatty 

Exactly. Exactly. And the goal of this, I don't want to say the wrong word, the goal of this is to try to keep an individual user's data on their machine, and grant them complete access to it and control over it. Whereas once it's sent off to a server somewhere, they lose control over what is done with that data.

Andrej Zito 

Also, this may be another stupid question. But if I'm having something translated through Google Translate, and it's sent to a server, yes, it stores the information that I wanted to translate, but doesn't it also help make the machine translation better? So in this case of the client-side MT, how does the MT get better if it doesn't get the data from the users?

Jeff Beatty 

That's a great question. What I assume is that the amount of available data is confined to what is stored locally on the machine. So the more that individual user is using the engine, the better the engine is becoming. But it's not gathering data from a centralized location, or at least it's not that data coming from a centralized location was first sent there by the user, if that makes sense. And that's probably as in-depth an answer as I can give; I'm not overly familiar with what's going on in that project.

Andrej Zito 

Yeah. Let's wrap up this machine learning and speak about Common Voice and DeepSpeech. That's one of the areas that you already touched on, and that's where you're helping to create data sets. So I guess we'll start with Common Voice. Common Voice is Mozilla's initiative to help teach machines how real people speak. And from what I saw on the website, you can sort of donate your voice, and you can also validate it. Can you explain in more detail how this whole thing works? And why did you even start working on this initiative?

Jeff Beatty 

It all goes back to the democratization of language, right? We observed that voice was the next frontier in human-computer interaction. There's a lot of complexity around that, and there is a need for a vast amount of vocal data sets in order to create great voice-based experiences for people in technology. A lot of that data, however, is siloed. So the problem that we saw was that these data sets were siloed behind Amazon or other companies, or that there were particular licenses that prohibited their use beyond a specific purpose. So what we set out to do was take advantage of the fact that we are community driven, and create a project where we could collect voice samples from our community, validate those samples, and generate an annotated data set that could be used outside of Mozilla, effectively democratizing and breaking up the silos for voice data sets. That was the main goal. The way this works is that we pull in content that is CC0, I believe, Creative Commons Zero licensed. A user goes to voice.mozilla.org, they create a profile, and they can read sentences that come from those corpora. And then you or someone else can listen to those recordings and validate that the recording actually matches what the text says, or you can address the quality of that recording. For languages where digital resources are scarce, there's a sentence collector tool that actually allows people to come and write their own sentences. Or if they find sentences from openly licensed places that aren't digitized yet, they can go in and manually type those sentences, and then they get fed into the pipeline and into Common Voice.

Andrej Zito 

Do you have any goals for how many voice lines you want to collect? Or is it a never-ending project?

Jeff Beatty 

I think, by and large, it's a never-ending project. But like with any machine learning project, there is a minimum threshold before that data set can be viable. I'm not familiar with what that threshold is.

Andrej Zito 

And any company or any user can come and download this dataset and do whatever they want with it. How does it tie to DeepSpeech? That’s your voice recognition engine, from what I understand.

Jeff Beatty 

So Common Voice feeds DeepSpeech. It feeds DeepSpeech with the datasets in there.

Andrej Zito 

But DeepSpeech is an engine. Again, it’s open source; it’s something that anyone can use for their apps.

Jeff Beatty 

Yep. Because again, a lot of voice recognition — not only the datasets, but the technology — is very siloed. We wanted to break up those silos and create more opportunities for people to create cool software with good voice experiences.

Andrej Zito 

Okay, let’s talk a little bit more about you personally. As the Senior Head of Localization at Mozilla, I’m wondering, what does your day look like?

Jeff Beatty 

It’s a lot of emails. It looks like a lot of emails, and a lot of meetings. I manage a decent-sized team, so there are a lot of one-on-ones with my team members to try to make sure that I can unblock them on the different projects they’re working on, and to make sure that they’re happy and enjoying working on the team and on those projects. I have a number of meetings with senior leadership where we talk about the localization program holistically. I have meetings with community members; occasionally I’ll have one-on-one meetings with them to mentor or give advice to some extent. And then it is a lot of emails. It’s also a lot of document writing: creating resources that describe how different processes work, defining specifications for new features, and monitoring and creating roadmaps for Pontoon and Fluent. Yeah, that’s my average day.

Andrej Zito 

When do you typically start your day?

Jeff Beatty 

Half of my team is in Europe, and I’m based in Utah. So I start my day at five in the morning and aim to have meetings with those members of my team based in Europe early in the morning my time, so that they can have a stable work schedule that doesn’t require them to stay up until all hours of the night. I try to structure my days around the turn of the earth so that I’m accommodating everybody’s unique schedules and their unique time zones.

Andrej Zito 

Do you have like a hard stop? Or do you take a break in between, or do you just go?

Jeff Beatty 

I try to stop no later than two or three. Very often I’m able to get away with one: I just power through from five to one, and then I’ll go and have lunch and enjoy the rest of my day. Sometimes that doesn’t work. I mean, Slack is a blessing and a curse, right? Where I once was able to ignore emails until the next day, you can’t ignore Slack pings the same way. So yeah, that’s kind of how I try to structure my day: five to one, or five to three at the latest.

Andrej Zito 

What are you curious about right now?

Jeff Beatty 

That’s a great question. I have recently started a product leadership certification course, because I am interested in understanding the thought process behind product decisions. Similar to project management, where I’ve learned a lot on the job, I’ve learned a lot about product management on the job. I can’t say that I do it incredibly well, and I’m hoping that this certification course enlightens me and gives me a better framework for being able to speak and work more effectively in that capacity. So right now, professionally, my biggest curiosity is around decision making for products. That could be analysis of different data sources, it could be analysis of revenue. I’ve only had one class so far, and it’s been very, very interesting. The other things I’m curious about are on a more personal level. My heritage is from Sweden, so I’ve always had an interest in the Swedish language and Swedish culture, and Icelandic culture as well, because I think part of my ancestry comes from there. I really enjoy understanding new cultures. I also recently got certified in cultural intelligence through the Cultural Intelligence Center. What that enables me to do is give people access to an empirically based self assessment and 360 assessment on cultural intelligence, then interpret the results of that assessment for them and advise them on how to improve their cultural intelligence. So cultural intelligence and product management — those two are really big interests of mine right now. And I think they come at a really interesting time, because I did that certification course before the Black Lives Matter movement really kicked up in the last two weeks. This was at the beginning of May that I did it, and while there had still been protests, it hadn’t been at such a forefront in the minds of the American people as it is right now.
So I’m finding it very useful, especially my cultural intelligence training, to take a look at the situation, assess it, and determine what motivations I have to understand more of what’s going on out in the world, acknowledge my own privilege and biases, understand more of where people are coming from, and take action to be more respectful of diversity. So yeah, all of those things combined — those are really what I’m curious about. As far as the industry is concerned, I am curious to see how far we can take machine translation, and how far it can be useful. I have a growing concern that machine translation is collectively creating a scenario where our standards for translation quality are lowering as a society. That’s creating more use cases for MT, but also eliminating use cases for human involvement in MT. So I’m curious to see if there’s any research or investment that goes into that. I’m also curious and interested in helping companies, and in seeing how companies respond to the UN declaration on indigenous languages, and whether that spurs any action to help localize or bring these communities into a digitally equitable ecosystem. Yeah — a lot of different things.

Andrej Zito 

What do you think is wrong with our industry?

Jeff Beatty 

We’re too quick to dismiss good ideas.

Andrej Zito 

Do you think it’s specific to localization?

Jeff Beatty 

You know, I don’t necessarily think it’s specific to localization. But in my firsthand experience — and granted, I know this is just my firsthand experience, and I’m generalizing it, for good or for ill — there are very few disruptions in our industry. And I shouldn’t even say that we’re too quick to dismiss good ideas; we’re too quick to poke holes in ideas and not identify the value in them, to assess an idea when it’s presented. Think about how long it took for the industry to really embrace translation memory. It took forever, right? The same is true of data exchange standards, like TMX or TBX. I think TBX still struggles a little bit to really gain a foothold in technology. If I go to any TMS or any CAT tool, TBX may or may not be listed as one of the file formats they support for terminology storage; very often it’s a spreadsheet or a CSV file or an XML file. It’s not actually TBX itself. So I think that we’re too quick to dismiss ideas without thoroughly assessing them, and there are not enough people willing to take risks and create disruptions in our space.
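The workaround Jeff describes — flattening TBX terminology into the spreadsheet or CSV shape most tools actually accept — can be sketched in a few lines. The TBX entry below is heavily simplified for illustration (real TBX, ISO 30042, has a richer structure and uses the namespaced `xml:lang` attribute); the function and field names are hypothetical.

```python
import csv
import io
import xml.etree.ElementTree as ET

# A heavily simplified TBX-style terminology entry (illustrative only;
# real TBX is richer and uses xml:lang rather than a plain lang attribute).
TBX_SAMPLE = """<tbx>
  <body>
    <termEntry id="c1">
      <langSet lang="en"><tig><term>tab</term></tig></langSet>
      <langSet lang="sv"><tig><term>flik</term></tig></langSet>
    </termEntry>
  </body>
</tbx>"""

def tbx_to_rows(tbx_xml: str):
    """Flatten each termEntry into (entry_id, language, term) rows."""
    root = ET.fromstring(tbx_xml)
    rows = []
    for entry in root.iter("termEntry"):
        for lang_set in entry.iter("langSet"):
            term = lang_set.find("./tig/term")
            rows.append((entry.get("id"), lang_set.get("lang"), term.text))
    return rows

rows = tbx_to_rows(TBX_SAMPLE)
out = io.StringIO()
csv.writer(out).writerows(rows)  # the CSV shape most CAT tools ingest
print(rows)  # -> [('c1', 'en', 'tab'), ('c1', 'sv', 'flik')]
```

The irony, of course, is that conversions like this are exactly why the standard struggles to gain a foothold: the round trip to CSV discards most of what TBX can express.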

Andrej Zito 

Mm hmm. Do you have any personal way to solve this? Do you empower your people more when it comes to presenting ideas to you?

Jeff Beatty 

I would like to think that I am an open book and receptive to ideas when they come. But I also acknowledge that I have my own biases, and it can depend on the day, right? If I’m tired, I don’t want to listen to ideas. I try to be more of a multiplier and help individuals assess their own ideas and assumptions and take those another step further, or help them refine their ideas to a place where additional steps could be taken. I’m not sure what to do on a high level; I don’t think there really is an action that could shake the whole industry and every individual in it. So I look at what I can control. If I can be part of a conference and speak about something controversial, or about something that is unique about the way that I see localization, then I try to take advantage of those opportunities. The same with LinkedIn. I recently made a controversial statement on LinkedIn about ROI being a terrible metric for localization success, and that we as an industry should stop talking about ROI. I won’t get into that.

Andrej Zito 

I think I saw it. Yeah.

Jeff Beatty 

But yeah. It’s looking at what you can control, and being vocal when you have a space to be vocal.

Andrej Zito 

This is a good point for my next question. Are there things that you changed your mind about throughout your life or career? It doesn’t have to be localization specific.

Jeff Beatty 

Oh, absolutely. Absolutely. If you’re not constantly checking your assumptions, then you’re setting yourself up to fall on your face, and I have a number of life experiences that have taught me just how critical that can be. For one, I’m colorblind. It’s not that I don’t see color; it’s that I can’t differentiate. We had an apricot tree in our backyard, and I could not see the apricots until they were on the ground against a dark, earth-toned background. Up in the tree, against the blue sky and the green leaves, I couldn’t see them at all. So I’ve come to acknowledge that my perception of reality is distorted in many ways because of the color blindness, and that has carried over into other aspects of my life, where I am always trying to check my assumptions. In doing so, I find myself changing opinions, or my opinions about certain issues evolve over time. Another great example — and I know this is probably getting a little too personal — is that I was diagnosed with cancer several years ago. I’m okay now, everything’s fine. But in the process leading up to that, I had also been diagnosed with severe depression. Part of the treatment for my cancer was to remove my entire thyroid gland and replace the hormones provided by that gland with medication. When I started taking the medication, my eyes were opened. Emotionally, I feel like I went from seeing the world in black and white to seeing color for the first time in a long time. That experience taught me that outside influences can have a strong hold on the way that you see things. Your internal influences can also have a very strong hold, and sometimes what it takes is for those external influences to interact in a specific way with those internal influences to create an evolved perspective. So both of these experiences — and both of them are health based and physiological, right? — have kind of taught me that in all things in life, reality is not as I perceive it. And if I’m not open to checking my own assumptions about reality, then I’m failing.

Andrej Zito 

Is that like an internal discussion or self-reflection, when you check your assumptions? Or do you explicitly ask for feedback from other people?

Jeff Beatty 

It’s a combination of both. One thing that the experience with thyroid medication taught me was that my own instincts are not trustworthy, and that even my own self-reflection — or the assumptions that come from it — has to be validated. And the only way to do that is either through action, acting on those assumptions and seeing how they play out, or by soliciting input from someone else.

Andrej Zito 

Yeah, I didn’t know that you were colorblind, because I think your background is the best among all the guests that I’ve had. It’s so beautiful — that red box that you have there, and the guitar. It’s a very nice contrast to the rest of the room.

Jeff Beatty 

Thank you. Yeah, I can see the red box, and I know that it’s red, huh? That’s saying something.

Andrej Zito 

Are there any absurd or stupid things that you do all the time?

Jeff Beatty 

All the time. I can’t think of anything offhand. I’m a father, and I think that by default that means that I love puns. There’s no escaping it when you become a father. Pun-based jokes are just the best, and it’s so easy to identify them. I am a silly, silly person. When it comes to puns, when it comes to a lot of different things, I’m the type of person that, especially for the sake of humor, will try to take things as far as they can possibly go. That often means that I’m doing stupid things, or saying stupid or outrageous things. Absurdism — oh, that is my cup of tea when it comes to humor, and that often leads to me doing dumb, dumb things.

Andrej Zito 

We were talking about this before we started the interview — you know, saying things that might hurt other people. Do you also get backlash when you take your jokes too far? That happens to me as well sometimes.

Jeff Beatty 

I know my audience, and so I experiment with absurdism in places where it’s appropriate. So with people that I trust — my wife, my kids, my good friends, some of my coworkers — I know when to turn it on and when to turn it off. Sometimes that means that when I’m turning it off, I’m laughing hysterically inside, because I have thought of a hilarious thing that is either too dark or too inappropriate for the situation. And I’ll kind of look foolish because I’m standing in a corner laughing, thinking that I’m just so clever and funny.

Andrej Zito 

I’m wondering how many interviews we need to have for you to turn it on, so that you can trust me and we can do like a comedy show.

Jeff Beatty 

It’ll be many. It takes me a long time to get to that place.

Andrej Zito 

Okay, let’s get serious — or maybe not. Maybe now it’s time for the final words from Jeff Beatty. If you could speak to the minds of everyone in the localization industry, what would be your last message?

Jeff Beatty 

I think that my last message would be: everyone has a right to information. Everyone does. Sometimes that’s going to bump up against the bottom line, but it doesn’t always have to. There is virtue and integrity in ensuring that your products and your information are accessible to the world, and as long as you are striving to make that happen, I think that you’re making a positive contribution to the world.

Andrej Zito 

Thank you. Thank you again, Jeff, for your time. Thanks for the lovely interview, for sharing the culture that you guys have at Mozilla, and your own life philosophy.

Jeff Beatty 

Of course. Thank you for having me.

Andrej Zito 

All right. Bye bye.

Jeff Beatty 

Bye.
