Summary of the Book "An Ugly Truth" by Sheera Frenkel and Cecilia Kang
Key Concepts in this Book:
- Zuckerberg has always prioritised engagement over ethics.
- Sandberg developed Facebook into a behemoth in the advertising world.
- Facebook attempted but failed to maintain its political neutrality.
- Facebook shied away from accepting blame for massive election interference.
- The lack of content filtering on Facebook contributed to real-life violence.
- With its anti-competitive practices, Facebook made a lot of enemies.
- Facebook attempted but failed to portray itself as a proponent of free expression.
- Several crises have forced Facebook to reconsider its commitment to free speech.
Who is this book for?
- Internet addicts looking to understand cyberspace.
- Political junkies addicted to their News Feed.
- Anyone who has ever logged on to social media.
What am I getting out of it? A candid look at a social media behemoth.
The stratospheric rise of Facebook was a sight to behold. This dorm-room idea grew from a little campus curiosity to a global social media powerhouse in just a decade. However, in recent years, the company's image has deteriorated. Privacy concerns, misinformation, and uneasy political ties have all been linked to the platform.
This summary gives an in-depth look into the complicated dynamics that turned Facebook into one of the most divisive companies on the planet. Based on meticulous reporting, it delves into how and why the platform became involved in crisis after scandal. Packed with stunning data and troubling realities, this analysis indicates that the social network may have been rotten from the start.
You'll learn:
- why Zuckerberg abandoned his first project, FaceMash;
- how Facebook impacted life in Myanmar;
- and why the internet giant employed a "ratcatcher."
1. Zuckerberg has always prioritised engagement over ethics.
It's December 8, 2015. A new video has been posted on Facebook. In the short clip, Donald Trump, one of many presidential candidates at the time, gives a fiery address. He rails against terrorists and immigrants, then demands that all Muslims be barred from entering the United States.
The video goes viral in a matter of hours, having been shared 14,000 times and receiving over 100,000 likes. Trump's anti-Muslim rhetoric is considered hate speech by many Facebook employees, and it is an obvious violation of the site's terms and conditions. They'd like it taken down from the site.
Mark Zuckerberg, however, is not convinced. After a meeting with Joel Kaplan, Facebook's vice president of public policy, Zuckerberg decides that the speech is too "newsworthy" to be removed. The video stays up, free to be shared.
The essential takeaway here is that Zuckerberg has always prioritised engagement over ethics.
Zuckerberg's approach to social networking sparked debate even when he was a Harvard undergrad. FaceMash, his first project, was a dorm-room website where he ranked the attractiveness of his female classmates. It was well-liked, but not by everyone. Student organisations were so critical of the site that Zuckerberg decided to create a new, less controversial venture called Thefacebook.
This primitive early version of what we now know as Facebook launched in 2004 and offered only a few features. It allowed students to create personal pages, connect with other users, and send messages to one another. Even so, it was a huge hit on college campuses. By 2005, the site had over a million users, with the majority of them logging in more than four times per day. The site's popularity motivated Zuckerberg to leave Harvard, relocate to Palo Alto, and devote his full attention to Facebook.
Facebook grew by leaps and bounds in its early years, and the site was heralded as Silicon Valley's next big thing. The buzz was so tremendous that Yahoo attempted to purchase the company for a billion dollars in 2006. The offer was turned down by Zuckerberg. Despite his shyness, awkwardness, and youth, he had big hopes for the business. Rather than focusing on revenues, he set his sights on expansion. He was always pushing his tiny team to make the site more engaging and amusing.
Facebook introduced the News Feed in September 2006. This new feature provided users with a consolidated hub that displayed all of their friends' activities. The Feed was initially hated because of the information overload and sudden lack of privacy. Facebook's stats, on the other hand, presented a different narrative. The Feed encouraged visitors to stay on the site longer and share more, which was exactly what Zuckerberg desired.
2. Sandberg developed Facebook into a behemoth in the advertising world.
Small talk was never Zuckerberg's favourite. Despite this, he dared to attend a Yahoo executive's Christmas party in December 2007. Zuckerberg, however, was not there for the holiday cheer. No, he was there to meet Sheryl Sandberg.
Sandberg had already established herself as a savvy businesswoman. Her outstanding résumé included Harvard degrees and a stint at the World Bank. At the time, she was a vice president at Google, perhaps Silicon Valley's most prestigious company.
The two talked business for over an hour at the party. They met several more times in the weeks that followed. In March 2008, Sandberg was enthusiastically announced as Facebook's new chief operating officer.
The main point here is that Sandberg turned Facebook into a massive advertising powerhouse.
Sandberg was, in many ways, exactly what Facebook required. While Zuckerberg was preoccupied with improving the site's technology and user interface, he was less interested in other matters, such as earnings. Sandberg, on the other hand, was a businesswoman through and through. During her time at Google, she had grown its modest advertising division into a multibillion-dollar business. Now she'd do the same thing with Facebook.
In Sandberg's opinion, Facebook was uniquely suited to the world of online advertising. While Google sold advertising based on search phrases, Facebook had access to a much broader range of user information. The company could use this information to serve customised adverts based on user behaviour. Not only that, but the site's interactive nature encouraged users to respond to adverts directly and share them with their friends.
To take advantage of these benefits, Facebook took steps to monetise its users' data more effectively. In 2009, it launched the "like" button. This feature allowed users to quickly respond to anything posted on the site. Facebook then used those "likes" to deliver tailored content and, more critically, to collect user preferences to sell to advertisers. The site's privacy settings were also changed. The new options were opaque and misleading, duping users into revealing more information.
The Center for Digital Democracy, a privacy advocacy group, took notice of Facebook's escalating misuse of user data. In December 2009, it filed a complaint with the Federal Trade Commission (FTC). Following the filing, Facebook agreed to regular privacy audits, but the government did little to actually monitor or oversee the company's behaviour in the years that followed.
3. Facebook attempted but failed to maintain its political neutrality.
Officially, Sonya Ahuja was an engineer. Unofficially, though, she was known as "the ratcatcher." Her job at Facebook was straightforward: whenever an unfavourable story about the firm made the news, she had to track down and dismiss the person who leaked the information.
2016 was a busy year for the ratcatcher. Gizmodo, a well-known tech blog, was publishing a series of damaging pieces on Facebook's internal conflicts. According to the reporting, as the election in the United States heated up, users' News Feeds became increasingly cluttered with bogus news and explosive hate speech. Employees at Facebook, according to Gizmodo, tried to stop this worrying trend.
The important takeaway is that Facebook attempted but failed to remain politically impartial.
By 2016, Facebook had become the principal source of news and information for millions of people in the United States and around the world. This was beneficial to the company's bottom line, but it came with drawbacks. For one thing, Facebook's News Feed algorithm promoted posts with high engagement, which were typically provocative and sensationalistic. Users frequently encountered content that validated their own prejudices, no matter how unfounded.
To change the tone of the site, the company launched "Trending Topics." This feature allowed Facebook's content team to curate, to some extent, which stories were highlighted for users. In May 2016, Gizmodo published a piece alleging that the site used this tool to suppress posts promoting right-wing views. The conservative media, as expected, went crazy over the news. Republicans had long suspected the site was biased against conservatives, and this appeared to be proof.
To defuse the backlash, Zuckerberg met with prominent conservatives such as Blaze TV's Glenn Beck and the American Enterprise Institute's Arthur Brooks. In these sessions, the Facebook founder emphasised his commitment to free speech, claiming that the company was doing everything it could to stay politically neutral. The effort only partly worked: conservatives remained suspicious, and liberals grumbled about the appeasement.
Meanwhile, Facebook's threat intelligence team had spotted an alarming new trend. Russian hackers appeared to be using the site to spread false material about Democratic candidates. Some hackers were also circulating emails and other information stolen from the Democratic National Committee, which wasn't technically against the site's guidelines. The team managed to take down some of the rogue accounts, but the harm had already been done: several of the posts had gone viral.
4. Facebook shied away from accepting blame for massive election interference.
Much of the country, including Facebook, was taken aback by Donald Trump's improbable election victory. In the days after the election, Zuckerberg and his team struggled to adjust to the new reality.
For starters, the business had to deal with a new, potentially unfriendly administration. So, to strengthen ties with the incoming president, the corporation recruited Corey Lewandowski, Trump's former campaign manager, as a consultant.
For many Facebook employees, working with Trump was a strange and unappealing prospect. In the months ahead, however, an even more unsettling possibility would emerge. As Facebook's investigation into its handling of the election progressed, it became evident that the firm may have aided Trump's victory.
The main point to take away from this is that Facebook avoided taking the blame for massive election interference.
In the months after the election, Alex Stamos, Facebook's chief security officer, initiated Project P. The goal of this internal investigation was to see whether outside parties had used the site as a vehicle for political propaganda. Stamos's team combed through the thousands of political advertisements purchased during the election, looking for any patterns that could point to coordinated political campaigns. Their search led them to the Internet Research Agency, or IRA.
The IRA, based in St. Petersburg, is a group dedicated to promoting Russian political interests. In 2016, it spent more than $100,000 on Facebook advertisements promoting radical positions on both the left and the right. The ads were widely viewed and shared, reaching a total of 126 million Americans. This operation, along with others like it, likely influenced the election's news cycle and outcome.
Facebook initially tried to minimise the findings, citing a desire to avoid becoming mired in a political crisis. Nonetheless, in March 2018, a scandal broke. According to the New York Times, Cambridge Analytica, a UK consulting firm, had exploited a loophole in Facebook's data-sharing rules to harvest data from up to 87 million users. The data was then used to create targeted political advertising for the Trump campaign.
The revelation revived concerns over Facebook's privacy practices. The company's stock price plummeted 10 percent, and Zuckerberg was summoned to testify before Congress. Yet many members of Congress appeared technologically clueless during the session, and Zuckerberg successfully avoided making any incriminating admissions. By the end of the day, Facebook's stock price had rebounded, and it appeared that the company would dodge any meaningful accountability.
5. The lack of content filtering on Facebook contributed to real-life violence.
It is August 2017. Sai Sitt Thway Aung, a Burmese soldier of the 99th Light Infantry Division, turns on his phone and logs into Facebook. He carefully composes a new post. It expresses his hatred for Muslims and his determination to expel them from his homeland.
Aung isn't the only one who feels this way. People all around Myanmar are posting and sharing harsh sentiments about the country's Muslim minority, the Rohingya. Hatred quickly escalates into violence. Over 24,000 Rohingya are slaughtered in the following months, with hundreds of thousands more fleeing to Bangladesh as refugees.
A United Nations fact-finding team later investigates the violence. It concludes that Facebook played a "decisive role" in turning ethnic tensions in the region into a full-fledged genocide.
The main takeaway is that Facebook's inadequate content control contributed to real-world violence.
By August 2013, Facebook had surpassed one billion users, a remarkable achievement. Nonetheless, Zuckerberg had loftier ambitions. He wanted his platform to accommodate billions more people. To achieve this, the company had to expand beyond the wealthy Western nations where it was already well established. So Zuckerberg launched the "Next One Billion," an initiative aimed at bringing internet access and Facebook to developing nations.
While the effort succeeded in attracting new Facebook members, it had unforeseen repercussions. The company had no idea how the platform would be used in different contexts. It also didn't recruit enough people to moderate the site in all of the new languages in which it was being used. As a result, the company was caught off guard, at least initially, when extremist anti-Muslim Buddhists began circulating anti-Rohingya sentiments.
Human rights activists warned Facebook about the dangers of hate speech on its platform as early as 2014. Matt Schissler, an activist working in Myanmar, even travelled to the company's offices to warn employees about the rising threat of violence. His pleas, however, went unheeded. Despite his efforts, Facebook did little to quell the hostility, and the Rohingya people suffered as a result.
This avoidable tragedy, along with the Cambridge Analytica scandal and other high-profile blunders, was beginning to tarnish Facebook's reputation. Facebook was no longer the hottest firm in Silicon Valley, and new talent was looking for jobs elsewhere. So, in July 2018, Zuckerberg issued a statement: he would become the company's "wartime CEO" and take a larger role in day-to-day operations. In the coming year, he'd try to get his company back on track.
6. With its anti-competitive practices, Facebook made a lot of enemies.
By May 2019, Zuckerberg had grown accustomed to negative press attention. But despite his thick skin, a new op-ed in the New York Times felt like a stab in the back. It was written by Chris Hughes, one of Facebook's early cofounders. The headline read: "It's Time to Break Up Facebook."
After leaving Facebook a decade earlier, Hughes had founded the Economic Security Project, a leftist think tank. In his article, he contended that Facebook had grown too big, too fast. The firm, he argued, stifled competition, mishandled consumer data, and functioned, in general, as a dangerous monopoly.
Zuckerberg had reason to be concerned. Hughes was just the latest in a long line of lawmakers, academics, and consumer advocates calling on the government to break up Facebook.
The takeaway here is that Facebook's anti-competitive practices have earned it a lot of enemies.
Facebook's rapid expansion was aided in part by a consistent policy of acquiring smaller competitors. By the time Hughes' op-ed appeared, the company had bought over 70 other businesses. The majority of these acquisitions were under $100 million, but there were several large mergers as well. In 2012, Facebook paid $1 billion for the photo app Instagram; in 2014, it paid $19 billion for the messaging service WhatsApp.
These purchases helped Facebook build a global user base of 2.5 billion people. They also provided the company with an enormous amount of consumer data. Despite Zuckerberg's initial vow to give each operation some autonomy, the services were eventually merged. While each app would appear independent, their back-end technology would be deeply interconnected.
According to law professors Tim Wu and Scott Hemphill, this integration was just a tactic to make antitrust enforcement more difficult. By combining the services, Facebook could argue that any government-mandated separation would be too disruptive. Nonetheless, the move did little to calm the political environment. As the 2020 election neared, Democrats such as Elizabeth Warren and Bernie Sanders made regulating Facebook a hot topic.
Meanwhile, Facebook kept making political missteps. Despite Zuckerberg and Sandberg's boasts about new moderation methods, the site remained in trouble. Throughout 2019, it was flooded with manipulated videos, including a doctored clip of House Speaker Nancy Pelosi, a longtime Silicon Valley supporter. When Zuckerberg refused to remove the video, Pelosi turned on Facebook. Suddenly, Facebook had relatively few friends in DC.
7. Facebook attempted but failed to portray itself as a proponent of free expression.
Zuckerberg had a busy summer in 2019. At the urging of Joel Kaplan, Nick Clegg, and Facebook's other public policy gurus, the CEO spent months rubbing shoulders with political elites. He visited Republican Senator Lindsey Graham, right-wing commentator Tucker Carlson, and, lastly, President Trump.
The encounter with Trump, which took place over Diet Cokes in the Oval Office, went off without a hitch. Zuckerberg praised Trump's excellent social media presence. Trump, in turn, warmed to Zuckerberg; he even celebrated the occasion with a tweet afterwards.
While many Facebook employees were irritated by the contentious meeting, Zuckerberg appeared unconcerned. After all, he needed pals in high places if he was going to defend Facebook's future.
The main takeaway is that Facebook failed to recast itself as a champion of free expression.
By meeting Trump, Zuckerberg sought to redirect the debate about Facebook. He wanted the president, and Republicans in general, to regard his company as an asset to the country. In meetings with legislators, he frequently argued that companies like his served as a key bulwark against growing competition from Chinese rivals such as WeChat and TikTok.
However, this was not Zuckerberg's only plan. He also intended to turn Facebook's slack moderation into a strength. Facebook announced in the run-up to the 2020 election season that it would not fact-check or filter any political advertising. Politicians, advocacy groups, and ordinary citizens were quick to criticise the new policy, fearful that misinformation would sabotage yet another election.
To deflect criticism, Zuckerberg gave a high-profile speech on the campus of Georgetown University in Washington, DC. The CEO praised Facebook's long-standing commitment to free speech in his remarks. He made odd analogies between posting online and the Civil Rights Movement, erroneously claiming that the site was formed to foster political conversation around the 2003 Iraq War.
Almost everyone, from the Anti-Defamation League to Congresswoman Alexandria Ocasio-Cortez, condemned the speech. In a subsequent interview with journalist Katie Couric, Sandberg was obliged to defend Zuckerberg's new stance. Despite her best efforts, she failed to win the critics over. After years of blunders, people were sick of the company's harmful impacts, and more trouble was on the way.
8. Several crises have forced Facebook to reconsider its commitment to free speech.
President Trump is at it again, blustering his way through another press conference from the podium. This time, it's April 2020, and COVID-19 is sweeping the globe. In a trademark digression, the president suggests that household disinfectants could cure the illness.
Of course, within minutes, the video surfaces on Trump's Facebook page. The post technically violates the site's policy against medical misinformation. Since the outbreak began, the company has been more diligent about suppressing dangerous rumours. But how should it handle this one? Should the president's post be removed?
Once again, Zuckerberg and his colleagues err on the side of free speech. The post stays live, errors and all.
The main point is that many crises are forcing Facebook to reconsider its free speech absolutism.
The spring of 2020 will be remembered as a watershed moment in the history of online content moderation. The COVID-19 pandemic, along with widespread protests in the aftermath of George Floyd's death, sparked a fresh round of harsh discourse, and platforms were beginning to respond. In a tweet in May, Trump suggested that protesters should be shot. Twitter, in an unusual move, flagged the message as harmful.
Despite this, Zuckerberg steadfastly refused to take any action. Facebook employees were furious. Many organised a digital walkout, and 33 early employees signed an open letter denouncing the company. By June, advertisers were fed up with the bad press, and large corporations such as Verizon, Starbucks, and Ford mounted a one-month boycott of the site.
Trump wasn't the only thorn in Facebook's side. The site's private groups were also causing turmoil. These unmoderated hotspots had devolved into breeding grounds for hate speech, conspiracy theories, and right-wing militia groups. Then, on January 6, 2021, a violent mob stormed the Capitol in an attempt to reverse the election results. According to media reports, many participants had spent a lot of time on Facebook, coordinating in unmoderated groups.
Finally, the company decided to change course. Facebook adopted a tougher stance against dangerous posts, and Trump's account was suspended indefinitely. The company even established the Facebook Oversight Board, a new, independent body set up to rule on content disputes. Ostensibly, the board provides extra oversight, but critics claim it is simply an effort to shift accountability away from the company's management. The future of Facebook is still up in the air.
The important message in this summary is that Facebook has courted controversy since its inception. Thanks to Mark Zuckerberg's focus on constant growth, addictive engagement algorithms, and loose moderation practices, the site became a powerful amplifier for extremist beliefs and political unrest. While the company has taken steps to rein in its worst excesses, there is still a lot of work to be done. Whether the social media behemoth will improve or fade away remains an open question.