Professionalism/Google Employees and Project Maven

Google

Google is a California-based technology company specializing in internet services, with a focus on online advertising, search, and cloud computing. Google is considered one of the top four U.S.-based technology companies, often referred to as the Big Four alongside Apple, Facebook, and Amazon.[1] In 2019, Google earned roughly $120 billion in revenue, an estimated 83% of which came from advertising.[2]

Google Employees

Google software developers earn exceptionally high salaries, with new college graduates earning an average total compensation of $180,418 according to crowdsourced data.[3] This is almost double the national average of $99,917 for college-graduate software engineers.[4] Employees are also incentivized to stay at the company, as perks such as stock awards take years to vest. Google employees have a history of protesting for what they believe in, such as a 2019 protest against what they saw as unjust firings[5] and the 2018 walkout over the company's handling of sexual harassment.[6]

Government Contracts Before Maven

Google received $49 million in payments from contracts with federal agencies between 2008 and 2019.[7] Most of these contracts involved advertising services, such as a $352,200 contract with the Department of Health and Human Services to place HealthCare.gov ads on third-party websites and in Google search results.[7]

Project Maven

The DOD's Problem

The Department of Defense (DOD) collects a large amount of full-motion video data every day. Military and civilian analysts had to manually analyze this data in support of counterinsurgency and counterterrorism operations, so the DOD needed a way to efficiently convert the data into actionable information.[8] In addition to this data-volume issue, the DOD's existing recognition technology could only identify simple objects, such as cars and people, and could not handle the more complex situations that arise in military operations.[9]

The Project

Project Maven, formally known as the “Algorithmic Warfare Cross-Function Team”, was launched in April 2017 as a DOD project exploring the use of artificial intelligence (AI) and object recognition on the battlefield. The project created a surveillance engine that uses wide-area motion imagery captured by government drones to autonomously detect objects of interest, track their motion, and relay the results to the DOD.[8] The system uses machine learning to continuously improve its analyses, and it simplifies analysts' work by autonomously flagging objects in footage for human review rather than requiring analysts to examine all of the data themselves. These algorithms do not select targets or order strikes;[9] however, many concerns were voiced about the intentions and potential outcomes of the project and Google's involvement.
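The workflow described above, in which a model narrows the footage and a human makes every substantive call, can be illustrated with a minimal sketch. The Python below is purely hypothetical and is not based on the actual Maven system; the Detection record, the labels, and the 0.8 confidence threshold are assumptions chosen only to show what "flagging objects for human review" means mechanically.

  # Illustrative sketch only; not the actual Project Maven system.
  # A detector has already produced per-frame detections; this step merely
  # decides which of them to surface to a human analyst.
  from dataclasses import dataclass
  from typing import List, Set

  @dataclass
  class Detection:                 # hypothetical record of one model output
      frame_id: int                # index of the video frame
      label: str                   # predicted object class, e.g. "vehicle"
      confidence: float            # model confidence score in [0.0, 1.0]

  def flag_for_review(detections: List[Detection],
                      labels_of_interest: Set[str],
                      threshold: float = 0.8) -> List[Detection]:
      """Return only the detections an analyst should look at.

      Frames with no qualifying detection are skipped entirely; the model
      narrows the footage, while a human still reviews every flagged item.
      """
      return [d for d in detections
              if d.label in labels_of_interest and d.confidence >= threshold]

  # Example: three detections, only one crosses the review threshold.
  sample = [
      Detection(frame_id=10, label="vehicle", confidence=0.93),
      Detection(frame_id=11, label="vehicle", confidence=0.41),
      Detection(frame_id=12, label="tree", confidence=0.99),
  ]
  for d in flag_for_review(sample, {"vehicle"}):
      print(f"frame {d.frame_id}: {d.label} ({d.confidence:.2f})")

The point of the sketch is that the algorithm only filters; any decision about what to do with a flagged object lies outside the code, which is where the debate over Google's involvement centered.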

Opposition

In April 2018, one year after Project Maven was launched, over 4,000 Google employees signed a letter to Google's CEO, Sundar Pichai, asking Google to withdraw from Project Maven.[9] The letter voiced employees' concerns that the U.S. military could weaponize the AI technology to refine drone strikes and other lethal attacks, suggesting that the project could contribute to drone warfare operations.[8] Employees argued that this work did not align with Google's core values and was not what they expected to be involved in when they joined the company. The contract would also not contribute significant revenue, awarding Google only $9 million, so many felt that the potential damage to Google's brand and the loss of public trust were not worth the risk.[10] Two months later, in June 2018, Google announced its withdrawal from the project, deciding not to renew the contract when it expired in 2019.[9]

Support

Those in support of Google's involvement in Project Maven argued that the U.S. needs to keep pace with other countries in the AI arms race, as many of them are developing AI-enabled military capabilities. Supporters also contended that someone would inevitably bring AI to warfare; the only question was who. If Google declined, the DOD would proceed anyway, even if it had to rely on less capable experts.[9] Because Google is known to employ some of the most talented engineers, giving the contract to a less qualified company would lower the quality of the resulting product.

The DOD's Initiatives

The DOD established two initiatives in response to Google’s withdrawal from and public concern with Project Maven.

  • Joint Artificial Intelligence Center (JAIC): A Center of Excellence to manage all of the military's AI operations. The center had an initial focus on humanitarian efforts to boost public relations. The first major initiative was to use AI to organize the military's search and rescue response to natural disasters.[9]
  • Defense Innovation Board: An advisory panel of technical experts devoted to reviewing AI ethics and developing principles for military AI use. The panel would hold public meetings with AI experts about how AI should or should not be integrated into weapons programs, then deliver recommendations to the Secretary of Defense.[9]

Ethical and Professional Considerations

Google employees expressed concern that involvement in Project Maven would hurt future recruitment efforts and paint the company in the same light as companies such as Palantir, Raytheon, and General Dynamics,[11] which are speculated to have engaged in controversial business practices.[12][13][14]

Employees also rejected the argument that the participation of companies such as Microsoft and Amazon was reason enough for Google to participate, stating that Google is set apart by its history, its reach into the lives of billions of users, and its motto “Don’t Be Evil”,[11] which has since been removed from the Code of Conduct.[15] In their view, Google should not take on risky projects simply because other firms do.

Ethical Considerations

Employees objected to participating in a project that would aid military surveillance and voiced concern that the technology could be used in more offensive situations.[11] Diane Greene, Google Cloud's CEO, offered reassurance that the technology would not be used to launch weapons or operate drones. Employees remained skeptical, arguing that because it is military software, there is no guarantee it will not be used to assist in those tasks.[11] Once the product is delivered, the engineers have no say in how it is used.

Professional Considerations

Employees were interested in preserving the trust that Google claims to value between itself and its users, especially amid growing fear of biased and weaponized artificial intelligence.[11] They believed that Google has a responsibility never to jeopardize the trust users place in the company.[11] If trust is already difficult to maintain when people hold negative opinions about a subset of what the company does, actively participating in work that would directly undermine that trust only makes matters worse.

Controversial Drone Use

One of the employees' concerns was the potential for the technology to be used to operate or launch weapons and drones.[11] Military drone use has many advocates, but others believe it is a step too far. One concern is imperfect targeting accuracy and the resulting civilian casualties.[16] Exact numbers of civilian casualties cannot be determined, especially since the Trump administration rescinded the requirement to publicly release an annual report of civilian casualties from U.S. airstrikes,[17] and some would set the ceiling of tolerable civilian deaths from drone strikes very low.

Another concern is that drone strikes may cause more of the terrorism they are supposed to prevent. Family members of strike victims, or even people upset that their country has been attacked, may want to retaliate, as illustrated by the “Times Square Bomber.”[18] A strike can thus create a terrorist where there may not have been one, although one can never know for certain whether that is the case. While some people think the good outweighs the bad when it comes to military drone use, others may see any harm as a reason not to allow drones to be used at all.

Lasting Impact

Project Maven Today

Palantir took over Project Maven in 2019 after Google declined to renew the contract. Palantir's CEO, Alex Karp, and Palantir as an organization hold the view that it is big tech's patriotic duty to do whatever the U.S. government asks of it.[19] Peter Thiel, Palantir's co-founder, has even said that Google should be investigated by the FBI and CIA for its seemingly treasonous decision to work with the Chinese military instead of the U.S. government.[20] Palantir has worked on many government contracts in the past, including building ICE and CBP's surveillance networks and creating software for police that circumvents the warrant process.[19]

Government Contracts After Maven

Current government defense contracts operate in an all-or-nothing fashion: companies must provide all services in a contract, which cannot be split among multiple bidders. In 2018, Google decided not to bid on a $10 billion cloud contract called the Joint Enterprise Defense Infrastructure (JEDI).[21] Google made this decision because it could not be assured that the contract would align with its AI principles and only wished to provide services for part of the contract. A Google spokesperson stated, “Had the JEDI contract been open to multiple vendors, we would have submitted a compelling solution for portions of it. Google Cloud believes that a multi-cloud approach is in the best interest of government agencies, because it allows them to choose the right cloud for the right workload.”[21]

Google AI Principles

Since 2018, Google has maintained a list of ethical principles for AI projects.[22] These principles serve as a guide for the selection and execution of future contracts. The list is divided into two categories: objectives for AI applications, and AI applications Google will not pursue. More information can be found on Google's website.

  1. Objectives for AI applications
    • Be socially beneficial.
    • Avoid creating or reinforcing unfair bias.
    • Be built and tested for safety.
    • Be accountable to people.
    • Incorporate privacy design principles.
    • Uphold high standards of scientific excellence.
    • Be made available for uses that accord with these principles.
  2. AI applications we will not pursue
    • Technologies that cause or are likely to cause overall harm. Where there is a material risk of harm, we will proceed only where we believe that the benefits substantially outweigh the risks, and will incorporate appropriate safety constraints.
    • Weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people.
    • Technologies that gather or use information for surveillance violating internationally accepted norms.
    • Technologies whose purpose contravenes widely accepted principles of international law and human rights.

Conclusion

Google employees wanted Google to drop Project Maven for several reasons outlined in their letter, but the overarching one was that they felt Google should not be in the business of war. Although the letter carried signatures from only a small portion of employees, its content raises questions that both individuals and companies must address when judging the morality of participating in a project. For some employees, Project Maven did not align with their ethical or professional values, and they took steps to change course. In this case, the employees made a strong argument that most likely contributed to Google's decision not to renew the Project Maven contract.

References

  1. http://www.barrons.com/articles/ranking-the-big-four-internet-stocks-google-is-no-1-apple-comes-in-last-1503412102
  2. https://www.forbes.com/sites/greatspeculations/2019/12/24/is-google-advertising-revenue-70-80-or-90-of-alphabets-total-revenue/#6eec9f644a01
  3. Levels.fyi. (n.d.). From https://www.levels.fyi/?compare=Google&track=Software%20Engineer
  4. Glassdoor. (n.d.). Software Engineer - New College Grad Salaries. From https://www.glassdoor.com/Salaries/new-college-grad-software-engineer-salary-SRCH_KO0,34.htm
  5. Arielle Pardes. (November 11, 2019). Google Employees Protest to Fight for the 'Future of Tech'. From https://www.wired.com/story/google-employees-protest-retaliation/
  6. Daisuke Wakabayashi, Erin Griffith, Amie Tsang and Kate Conger. (November 1, 2018). Google Walkout: Employees Stage Protest Over Handling of Sexual Harassment. From https://www.nytimes.com/2018/11/01/technology/google-walkout-sexual-harassment.html
  7. Investopedia. (June 25, 2019). "Google Could Lose All Government Contracts Over Data Compliance (GOOG)". From https://www.investopedia.com/news/google-could-lose-all-government-contracts-over-data-compliance/
  8. Kalvapalle, R. (2018, April 4). Google Employees Ask Tech Giant to Pull Out of Pentagon AI Project. https://globalnews.ca/news/4124514/google-project-maven-open-letter-pentagon/
  9. Frisk, A. (2018, April 5). What is Project Maven? The Pentagon AI Project Google Employees Want Out Of. https://globalnews.ca/news/4125382/google-pentagon-ai-project-maven/
  10. Shane, S., Metz, C., & Wakabayashi, D. (2018, May 30). How a Pentagon Contract Became an Identity Crisis for Google. https://www.nytimes.com/2018/05/30/technology/google-project-maven-pentagon.html
  11. Shane, S., Wakabayashi, D. (2018, April 4). 'The Business of War': Google Employees Protest Work for the Pentagon. The New York Times. https://www.nytimes.com/2018/04/04/technology/google-letter-ceo-pentagon-project.html
  12. Chan, R. (2019, July 19). Here's what you need to know about Palantir, the secretive $20 billion data-analysis company whose work with ICE is dragging Amazon into controversy. Business Insider. https://www.businessinsider.com/palantir-ice-explainer-data-startup-2019-7
  13. Mattera, P. (2012, September 23). Raytheon. https://www.corp-research.org/raytheon
  14. Mattera, P. (2012, September 23). General Dynamics. https://www.corp-research.org/general-dynamics
  15. Conger, K. (2018, May 18). Google Removes 'Don't Be Evil' Clause From Its Code of Conduct. https://gizmodo.com/google-removes-nearly-all-mentions-of-dont-be-evil-from-1826153393
  16. al-Zikry, M., Michael, M. (2018, November 14). Hidden toll of US drone strikes in Yemen: Nearly a third of deaths are civilians, not al-Qaida. Military Times. https://www.militarytimes.com/news/your-military/2018/11/14/hidden-toll-of-us-drone-strikes-in-yemen-nearly-a-third-of-deaths-are-civilians-not-al-qaida/
  17. Ryan, M. (2019, March 6). Trump administration alters Obama-era bill on civilian casualties in U.S. airstrikes. The Washington Post. https://www.washingtonpost.com/world/national-security/white-house-weakens-obama-era-rule-on-civilian-casualties/2019/03/06/b2940dfe-4031-11e9-9361-301ffb5bd5e6_story.html
  18. Adams, L., Nasir, A. (2010, September 18). Inside the mind of the Times Square bomber. The Guardian. https://www.theguardian.com/world/2010/sep/19/times-square-bomber
  19. Greene, Tristan. (December 2019). Report: Palantir took over Project Maven, the military AI program too unethical for Google. From https://thenextweb.com/artificial-intelligence/2019/12/11/report-palantir-took-over-project-maven-the-military-ai-program-too-unethical-for-google/
  20. Sandler, Rachel. (July 15, 2019) Peter Thiel Says CIA Should Investigate Google For Being 'Treasonous'. From https://www.forbes.com/sites/rachelsandler/2019/07/15/peter-thiel-says-cia-should-investigate-google-for-being-treasonous/#67ac6ca3521d
  21. Rosalie Chan. (October 8, 2018). From https://www.businessinsider.com/google-drops-out-of-10-billion-jedi-contract-bid-2018-10
  22. Google. (n.d.). Artificial Intelligence at Google: Our Principles. From https://ai.google/principles/