Facebook Whistleblower Frances Haugen testifies before Senate Commerce Committee


SUBTITLES:

Subtitles generated by robot
00:01
[Music] this meeting and hearing of the
08:23
subcommittee on consumer protection of the commerce committee will come to order i'm very pleased to welcome my colleagues and i want to thank ranking member senator blackburn for her cooperation and collaboration we've been working very closely and the ranking member who is here senator wicker as well as our chairwoman maria cantwell senator cantwell i'm sure will be here shortly uh most important
08:54
i'd like to thank our witness francis haugen for being here and the two council who are representing her today and i want to give you my heartfelt gratitude for your courage and strength in coming forward as you have done standing up to one of the most powerful implacable corporate giants in the history of the world without any exaggeration you have a compelling
09:25
credible voice which we've heard already but you are not here alone you're armed with documents and evidence and you speak volumes as they do about how facebook has put profits ahead of people among other revelations the information that you have provided to congress is powerful proof that facebook knew
09:56
its products were harming teenagers facebook exploited teens using powerful algorithms that amplified their insecurities and abuses through what it found was an addict's narrative there is a question which i hope you will discuss as to whether there is such a thing as a safe algorithm facebook saw teens creating secret accounts
10:27
that are often hidden from their parents as unique value proposition in their words a unique value proposition a way to drive up numbers for advertisers and shareholders at the expense of safety and it doubled down on targeting children pushing products on pre-teens not just teens but pre-teens that it knows are harmful to our kids mental health and well-being
10:58
instead of telling parents facebook concealed the facts it sought to stonewall and block this information from becoming public including to this committee when senator blackburn and i specifically asked the company and still even now as of just last thursday when a facebook witness came for this committee it has refused disclosure or even to tell us when it might decide whether to disclose additional documents
11:30
and they've continued their tactics even after they knew the destruction it caused it isn't just that they made money from these practices but they continued to profit from them their profit was more important than the pain that they caused last thursday the message from ms antigone davis facebook's global head of safety was simple quote this research is not a bombshell
12:02
end quote and she repeated the line not a bombshell well this research is the very definition of a bombshell facebook and big tech are facing a big tobacco moment a moment of reckoning the parallel is striking i sued big tobacco as connecticut's attorney general i helped to lead the states in that legal action and i remember very very well
12:34
the moment in the course of our litigation when we learned of those files that showed not only that big tobacco knew that its product caused cancer but that they had done the research they concealed the files and now we knew and the world knew and big tech now faces that big tobacco jaw-dropping moment of truth it is
13:03
documented proof that facebook knows its products can be addictive and toxic to children and it's not just that they made money again it's that they valued their profit more than the pain that they cause to children and their families the damage to self-interest and self-worth inflicted by facebook today will haunt a generation feelings of inadequacy and insecurity
13:38
rejection and self-hatred will impact this generation for years to come our children are the ones who are victims teens today looking at themselves in the mirror feel doubt and insecurity mark zuckerberg ought to be looking at himself in the mirror today and yet rather than taking responsibility and showing leadership
14:11
mr zuckerberg is going sailing his new modus operandi no apologies no admission no action nothing to see here mark zuckerberg you need to come before this committee you need to explain to francis haugen to us to the world and to the parents of america what you were doing and why you did it instagram's business model is pretty straightforward more eyeballs more
14:43
dollars everything facebook does is to add more users and keep them on their apps for longer in order to hook us instagram uses our private information to precisely target us with content and recommendations assessing that what will provoke a reaction will keep us scrolling far too often these recommendations encourage our most destructive and dangerous
15:15
behaviors as we showed on thursday we created a fake account my office and i did as a teen interested in extreme dieting and eating disorders instagram latched on to that teenager's initial insecurities it then pushed more content and recommendations glorifying eating disorders that's how instagram's algorithms can push teens into darker and darker
15:47
places facebook's own researchers called it instagram's quote perfect storm exacerbating downward spirals facebook as you have put it so powerfully maximizes profits and ignores pain facebook's failure to acknowledge and to act makes it morally bankrupt again and again facebook rejected reforms recommended by its own researchers
16:19
last week ms davis said quote we're looking at end quote no specific plans no commitments only vague platitudes these documents that you have revealed provided this company with a blueprint for reform provided specific recommendation that could have made facebook and instagram safer the company repeatedly ignored those recommendations from its own
16:51
researchers that would have made facebook and instagram safer facebook researchers have suggested changing their recommendations to stop promoting accounts known to encourage dangerous body comparison instead of making meaningful changes facebook simply pays lip service and if they won't act
17:21
and if big tech won't act congress has to intervene privacy protection is long overdue senator markey and i have introduced the kids act which would ban addictive tactics that facebook uses to exploit children parents deserve better tools to protect their children i'm also a firm supporter of reforming section 230. we should consider narrowing this sweeping immunity when platforms algorithms
17:52
amplify illegal conduct you've commented on this in your testimony and perhaps you'll expand on it we have also heard compelling recommendations about requiring disclosures of research and independent reviews of these platforms algorithms and i plan to pursue these ideas the securities and exchange commission should investigate your contentions and claims ms haugen and so should the federal trade commission facebook appears to have
18:24
misled the public and investors and if that's correct it ought to face real penalties as a result of that misleading and deceptive misrepresentation i want to thank all my colleagues who are here today because what we have is a bipartisan congressional roadmap for reform that will safeguard and protect children from big tech that will be a focus of our subcommittee moving forward and it will continue to
18:59
be bipartisan and finally i'll just end on this note in the past weeks and days parents have contacted me with their stories heartbreaking and spine chilling stories about children pushed into eating disorders bullying online self-injury of the most disturbing kind and sometimes even taking their lives
19:30
because of social media parents are holding facebook accountable because of your bravery ms haugen and we need to hold accountable facebook and all big tech as well again my thanks to you i am going to enter into the record a letter from 52 state attorneys general and from two members of the youth advisory board of sandy hook promise as long as there's no objection and i will now turn to the ranking
20:01
member senator blackburn thank you mr chairman and thank you for entering that letter in the record that we have from our states' attorneys general good morning to everyone it is nice to see people in this hearing room and uh to be here for the hearing today miss haugen we thank you for your appearance before us today and for giving the opportunity not only for congress but for uh for the american people to hear from you in this setting and we
20:34
appreciate that mr chairman i think also thanks to you and your staff that have worked with our team to make certain that we had this hearing and this opportunity today so that we can get more insights into what facebook is actually doing as they invade the privacy not only of adults but of children and look at the ways that they are in violation of the children's online privacy
21:04
protection act which is federal law and looking at how they are evading that law and working around it and as the chairman said passing a federal online privacy standard has been long in the works i filed my first privacy bill when i was in the house back in 2012 and i think that it will be this congress and this subcommittee that is
21:36
going to lead the way to online privacy data security section 230 reforms and of course senator klobuchar always wants to talk about antitrust and i have to give a nod senator markey is down there when we were in the house we were probably two of the only ones who were talking about the need to have a federal privacy standard now as the chairman mentioned last week
22:06
we heard from ms davis who heads global safety for facebook and it was surprising to us that what she tried to do was to minimize the information that was in these documents to minimize the research and to minimize the knowledge that facebook had at one point i even reminded her the research was not third party research
22:36
the research was facebook's own internal research so they knew what they were doing they knew where the violations were and they know they are guilty they know this their research tells them this last week in advance of our hearing facebook released two studies and said that the wall street journal was all
23:10
wrong they had just gotten it wrong as if the wall street journal did not know how to read these documents and how to work through this research having seen the data that you've presented and the other studies that facebook did not publicly share i feel pretty confident that it's facebook who has done the misrepresenting to this committee here are some of the
23:42
numbers that facebook chose not to share and mr chairman i think it's important that we look at these as we talk about the setting for this hearing what we learned last week what you and i have been learning over the past three years about big tech and facebook and here you go 66 percent of teen girls on instagram and 40 percent of teen boys experience negative social comparisons this is facebook's research 52 percent
24:15
of teen girls who experienced negative social comparison on instagram said it was caused by images related to beauty social comparison is worse on instagram because it is perceived as real life but based on celebrity standards social comparison mimics the grief cycle and includes a downward emotional spiral encompassing a range of emotions from jealousy to self-proclaimed
24:46
body dysmorphia facebook addiction which facebook calls conveniently problematic use is most severe in teens peaking at age 14. here's what else we know facebook is not interested in making significant changes to improve kids safety on their platforms at least not when that would result in losing eyeballs on post or
25:17
decreasing their ad revenues in fact facebook is running scared as they know that in their own words young adults are less active and less engaged on facebook and that they are running out of teens to add to instagram so teens are looking at other platforms like tiktok and facebook is only making those changes that add to its user numbers and ultimately its profits follow the money
25:50
so what are these changes allowing users to create multiple accounts that facebook does not delete and encouraging teens to create second accounts they can hide from their parents they are also studying younger and younger children as young as eight so that they can market to them and while miss davis says that kids below 13 are not allowed on facebook or instagram we know that they are because she told us
26:22
that they recently had deleted 600 000 accounts from children under age 13. so how do you get that many underage accounts if you aren't turning a blind eye to them in the first place and then in order to try to clean it up you go to delete it and then you say oh by the way we just in the last month deleted 600 000 underage accounts
26:54
and speaking of turning a blind eye facebook turns a blind eye to user privacy news broke yesterday that the private data of over 1.5 billion that's right 1.5 billion facebook users is being sold on a hacking forum that's its biggest data breach to date examples like this underscore my strong concerns about facebook collecting the data of kids and teens and what they are
27:25
doing with it facebook also turns a blind eye toward blatant human exploitation taking place on its platform trafficking forced labor cartels the worst possible things one can imagine big tech companies have gotten away with abusing consumers for too long it is clear that facebook prioritizes profit over the well-being of children and all users so as a mother and a grandmother
27:59
this is an issue that is of particular concern to me so we thank you for being here today miss haugen and we look forward to getting to the truth about what facebook is doing with users data and how they are abusing their privacy and how they show a lack of respect for the individuals that are on their network we look forward to the testimony thank you mr chairman
28:30
thank you senator blackburn i don't know whether the ranking member would like to make any remarks if you don't mind i thank you um chairman blumenthal and i will just take a moment or two and i do appreciate being able to speak as ranking member of the full committee ms haugen this is a subcommittee hearing you see some vacant seats uh there's pretty good attendance for a subcommittee there are also a lot of things going on so people will be coming and going but
29:00
i'm willing to predict that this will have almost 100 percent attendance by members of the subcommittee because of the importance of this subject matter so thanks for coming forward to share concerns about facebook's business practices particularly with respect to children and teens and of course that is the main topic it's the title of our hearing today protecting kids online the recent revelations about
29:32
facebook's mental health effects on children and its plan to target younger audiences are indeed disturbing um and i think you're going to see a lot of bipartisan concern about this today and and in future hearings um they just they show how urgent it is for congress to act against powerful tech companies on behalf of children and the broader public
30:02
and i say powerful tech companies they are possessive of immense power their product is addictive and people on both sides of this dais are concerned about this i talked to an opinion maker um just down the hall a few moments before this hearing this person said the um tech gods have been demystified now and i think this hearing
30:35
today mr chair is a part of the process of demystifying big tech the children of america are hooked on their product it is often destructive and harmful and there is a cynical knowledge on behalf of the leadership of these big tech companies that that is true ms haugen i hope you will have a chance to talk about your work experience at facebook
31:08
and perhaps compare it to other social media companies i also look forward to hearing your thoughts on how this committee and how this congress can ensure greater accountability and transparency especially with regard to children so thank you mr chairman and thank you ms haugen for being here today thanks senator wicker our witness this morning is frances haugen she was the lead product manager on facebook's civic misinformation team she
31:39
holds a degree in electrical and computer engineering from olin college and an mba from harvard she made the courageous decision as all of us here and many others around the world know to leave facebook and reveal the terrible truths about the company she learned during her tenure there and i think we are all in agreement here in expressing our gratitude and our admiration for
32:10
your bravery in coming forward thank you miss haugen please proceed good afternoon chairman blumenthal ranking member blackburn and members of the subcommittee thank you for the opportunity to appear before you my name is frances haugen i used to work at facebook i joined facebook because i think facebook has the potential to bring out the best in us but i'm here today because i believe facebook's products harm children stoke division and weaken our democracy
32:48
the company's leadership knows how to make facebook and instagram safer but won't make the necessary changes because they have put their astronomical profits before people congressional action is needed they won't solve this crisis without your help yesterday we saw facebook get taken off the internet i don't know why it went down but i know that for more than five hours facebook wasn't used to deepen divides destabilize democracies and make young
33:18
girls and women feel bad about their bodies it also means that millions of small businesses weren't able to reach potential customers and countless photos of new babies weren't joyously celebrated by family and friends around the world i believe in the potential of facebook we can have social media we enjoy that connects us without tearing apart our democracy putting our children in danger and sowing ethnic violence around the world
33:48
we can do better i have worked as a product manager at large tech companies since 2006 including google pinterest yelp and facebook my job has largely focused on algorithmic products like google plus search and recommendation systems like the one that powers the facebook news feed having worked on four different types of social networks i understand how complex and nuanced these problems are however the choices being made inside of facebook are disastrous
34:20
for our children for our public safety for our privacy and for our democracy and that is why we must demand facebook make changes during my time at facebook first working as the lead product manager for civic misinformation and later on counter espionage i saw facebook repeatedly encounter conflicts between its own profits and our safety facebook consistently resolved these conflicts in favor of its own profits the result has been more division more harm
34:52
more lies more threats and more combat in some cases this dangerous online talk has led to actual violence that harms and even kills people this is not simply a matter of certain social media users being angry or unstable or about one side being radicalized against the other it is about facebook choosing to grow at all costs becoming an almost trillion dollar company by buying its profits with our safety during my time at facebook i came to
35:23
realize a devastating truth almost no one outside of facebook knows what happens inside of facebook the company intentionally hides vital information from the public from the us government and from governments around the world the documents i have provided to congress prove that facebook has repeatedly misled the public about what its own research reveals about the safety of children the efficacy of its artificial intelligence systems and its role in spreading divisive and extreme messages
35:54
i came forward because i believe that every human being deserves the dignity of the truth the severity of this crisis demands that we break out of our previous regulatory frames facebook wants to trick you into thinking that privacy protections or changes to section 230 alone will be sufficient while important these will not get to the core of the issue which is that no one truly understands the destructive choices made by facebook except facebook we can afford nothing less than full
36:25
transparency as long as facebook is operating in the shadows hiding its research from public scrutiny it is unaccountable until the incentives change facebook will not change left alone facebook will continue to make choices that go against the common good our common good when we realized big tobacco was hiding the harms it caused the government took action when we figured out cars were safer with seat belts the government took action
36:58
and when our government learned that opioids were taking lives the government took action i implore you to do the same here today facebook shapes our perception of the world by choosing the information we see even those who don't use facebook are impacted by the majority who do a company with such frightening influence over so many people over their deepest thoughts feelings and behavior needs real oversight but facebook's closed design means it has no real oversight
37:30
only facebook knows how it personalizes your feed for you at other large tech companies like google any independent researcher can download from the internet the company's search results and write papers about what they find and they do but facebook hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system facebook will tell you privacy means they can't give you data this is not true when tobacco companies claimed that
38:00
filtered cigarettes were safer for consumers scientists could independently invalidate these marketing messages and confirm that in fact they posed a greater threat to human health the public cannot do the same with facebook we are given no other option than to take their marketing messages on blind faith not only does the company hide most of its own data my disclosure has proved that when facebook is directly asked questions as important as how do you impact the
38:31
health and safety of our children they mislead they choose to mislead and misdirect facebook has not earned our blind faith this inability to see into facebook's actual systems and confirm that they work as communicated is like the department of transportation regulating cars by only watching them drive down the highway today no regulator has a menu of solutions for how to fix facebook because facebook didn't want them to
39:04
know enough about what's causing the problems otherwise there wouldn't have been a need for a whistleblower how is the public supposed to assess if facebook is resolving conflicts of interest in a way that is aligned with the public good if the public has no visibility into how facebook operates this must change facebook wants you to believe that the problems we're talking about are unsolvable they want you to believe in false choices they want you to believe that you must
39:35
choose between a facebook full of divisive and extreme content or losing one of the most important values our country was founded upon free speech that you must choose between public oversight of facebook's choices and your personal privacy that to be able to share fun photos of your kids with old friends you must also be inundated with anger driven virality they want you to believe that this is just part of the deal i am here today to tell you that's not
40:06
true these problems are solvable a safer free speech respecting more enjoyable social media is possible but there is one thing that i hope everyone takes away from these disclosures it is that facebook can change but is clearly not going to do so on its own my fear is that without action the divisive and extremist behaviors we see today are only the beginning what we saw in myanmar and are now seeing in ethiopia are only the opening
40:38
chapters of a story so terrifying no one wants to read the end of it congress can change the rules that facebook plays by and stop the many harms it is now causing we now know the truth about facebook's destructive impact i really appreciate the seriousness with which the members of congress and the securities and exchange commission are approaching these issues i came forward at great personal risk because i believe we still have time to act
41:09
but we must act now i'm asking you our elected representatives to act thank you thank you miss haugen thank you for taking that personal risk and we will do anything and everything to protect and stop any retaliation against you and any legal action that the company may bring to bear or anyone else and we've made that i think very clear in the course of these proceedings i want to ask you about this idea of
41:42
disclosure you've talked about looking in effect at a car going down the road and we're going to have five-minute rounds of questions maybe a second round if you were willing to do it we're here today to look under the hood and that's what we need to do more in august senator blackburn and i wrote to mark zuckerberg and we asked him pretty straightforward questions about how the company works and safeguards children and teens
42:15
on instagram facebook dodged ducked sidetracked in effect misled us so i'm going to ask you a few straightforward questions to break down some of what you have said and if you can answer them yes or no that would be great has facebook's research its own research ever found that its platforms can have a negative effect on children and teens mental health or well-being
42:48
many of facebook's internal research reports indicate that facebook has a serious negative harm on a significant portion of teenagers and younger children and has facebook ever offered features that it knew had a negative effect on children's and teens mental health facebook knows that its amplification algorithms things like engagement based ranking on instagram can lead children
43:18
from very innocuous topics like healthy recipes i think all of us could eat a little more healthy all the way from just something innocent like healthy recipes to anorexia promoting content over a very short period of time and has facebook ever found again in its research that kids show sign of addiction on instagram facebook has studied a pattern that they call problematic use what we might more commonly call addiction it has a very high bar for what it
43:50
believes it is it says you self-identify that you don't have control over your usage and that it is materially harming your health your schoolwork or your physical health five to six percent of 14 year olds have the self-awareness to admit both those questions it is likely that far more than five to six percent of 14 year olds are addicted to instagram last thursday uh my colleagues and i asked ms davis who was representing
44:20
facebook about how the decision would be made whether to pause permanently instagram for kids and she said quote there's no one person who makes a decision like that we think about that collaboratively it's as though she couldn't mention mark zuckerberg's name isn't he the one who will be making this decision from your experience in the company mark holds a very unique role in the
44:51
tech industry in that he holds over 55 percent of all the voting shares for facebook um there are no similarly powerful companies that are as unilaterally controlled and in the end the buck stops with mark there is no one currently holding mark accountable but himself and mark zuckerberg in effect is the algorithm designer in chief correct um i received an mba from harvard and they emphasized to us that we are responsible for the organizations that we build
45:24
mark has built an organization that is very metrics driven it is intended to be flat there is no unilateral responsibility the metrics make the decision unfortunately that itself is a decision and in the end if he is the ceo and the chairman of facebook he is responsible for those decisions the buck stops with him the buck stops with him uh and speaking of the buck stopping uh you have said that
45:54
facebook should declare moral bankruptcy i agree i think its actions and its failure to acknowledge its responsibility indicate moral bankruptcy there is a cycle occurring inside the company where facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects it has chosen to take on facebook is stuck in a cycle where it struggles to hire that causes it to understaff projects
46:26
which causes scandals which then makes it harder to hire part of why facebook needs to come out and say we did something wrong we made some choices that we regret is that the only way we can move forward and heal facebook is to first admit the truth the way we'll have reconciliation and can move forward is by first being honest and declaring moral bankruptcy being honest and acknowledging that facebook has caused and aggravated a lot of pain simply to make more money and it has
46:57
profited off spreading disinformation and misinformation and sowing hate facebook's answer to facebook's destructive impact always seems to be more facebook we need more facebook which means more pain and more money for facebook would you agree i don't think at any point facebook set out to make a destructive platform i think the challenge is that facebook has set up an organization
47:30
where the parts of the organization responsible for growing and expanding the organization are separate and not regularly cross-pollinated with the parts of the company that focus on the harms that the company is causing and as a result integrity projects that were hard fought by the teams trying to keep us safe are regularly undone by new growth projects that counteract those same remedies so i do think there are organizational problems that need oversight and facebook needs
48:00
help in order to move forward to a more healthy place and whether it's teens bullied into suicidal thoughts or the genocide of ethnic minorities in myanmar or fanning the flames of division within our own country or in europe they are ultimately responsible for the immorality of the pain that's caused facebook needs to take responsibility
48:30
for the consequences of its choices it needs to be willing to accept small trade-offs on profit and i think just that act of being able to admit that it's a mixed bag is important and i think what we saw from antigone last week is an example of the kind of behavior we need to support facebook in growing out of which is instead of just focusing on all the good they do admitting they have responsibilities to also remedy the harm but mark zuckerberg's new policy is no apologies
49:01
no admissions no acknowledgement nothing to see here we're going to deflect it and go sailing i turn to the ranking member thank you mr chairman thank you for your testimony i want to stay with ms davis in some of her comments because i had asked her last week about the underage users and she had made the comment i'm going to quote from her testimony if we find an account of someone who's under 13 we
49:34
remove them in the last three months we removed 600 000 accounts of under-13-year-olds end quote and i have to tell you it seems to me that there's a problem if you have 600 000 accounts from children who ought not to be there in the first place so what did mark zuckerberg know about facebook's plans to bring kids on as new users and advertise to them
50:08
there are reports within facebook that show cohort analyses where they examine at what ages do people join facebook and instagram and based on those cohort analyses so facebook likes to say children lie about their ages to get onto the platform the reality is enough kids tell the truth that you can work backwards to figure out what are approximately the real ages of anyone who's on the platform when facebook does cohort analyses and
50:38
looks back retrospectively and discovers things like you know up to 10 to 15 percent of even 10 year olds in a given cohort may be on facebook or instagram um okay so this is why adam mosseri who's the ceo of instagram would have replied to jojo siwa when she said to him oh i've been on instagram since i was eight he said he didn't want to know that so it would be for this reason correct um
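[Editor's illustration, not part of the testimony: the sketch below is a toy version of the "work backwards" cohort idea described above. All field names, dates, and numbers are invented for illustration; this is not Facebook's code and makes no claim about its actual methodology.]

```python
# Illustrative sketch only: estimate age at signup by "working backwards" from
# the age a user reports today and the year they joined. Hypothetical data.
from collections import Counter
from datetime import date

users = [
    {"reported_age_today": 16, "signup_year": 2015},  # would have been ~10 at signup
    {"reported_age_today": 19, "signup_year": 2013},
    {"reported_age_today": 21, "signup_year": 2016},
    {"reported_age_today": 22, "signup_year": 2012},
]

def estimated_age_at_signup(user, today=date(2021, 10, 5)):
    """If enough users report a truthful age today, age at signup is roughly
    today's age minus the number of years since they joined."""
    years_on_platform = today.year - user["signup_year"]
    return user["reported_age_today"] - years_on_platform

def underage_share_by_cohort(users, cutoff=13):
    """Per signup-year cohort, estimate the share who were under `cutoff` at signup."""
    totals, underage = Counter(), Counter()
    for u in users:
        cohort = u["signup_year"]
        totals[cohort] += 1
        if estimated_age_at_signup(u) < cutoff:
            underage[cohort] += 1
    return {cohort: underage[cohort] / totals[cohort] for cohort in totals}

print(underage_share_by_cohort(users))
```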
51:09
a pattern of behavior that i saw at facebook was that often problems were so understaffed that there was a kind of an implicit discouragement from having better detection systems so for example my last team at facebook was on the counter espionage team within the threat intelligence org and at any given time our team could only handle a third of the cases that we knew about we knew that if we built even a basic detector we would likely have many more cases okay then similarly
51:41
yeah let me ask you this so you look at the way that they have the data but they're choosing to keep that data and advertise from it right you sell it to third parties so what does facebook do you've got these six hundred thousand accounts that ought not to be on there anymore and right but then you delete those accounts but what happens to that data does facebook
52:12
keep that data do they keep it until those children go to age 13 since as you're saying they can work backward and figure out the true age of a user so what do they do with it do they delete it do they store it do they keep it how do they process that um my understanding of facebook's data retention policies and i want to be really clear i didn't work directly on that is that when they
52:42
delete an account they delete all the data i believe within 90 days in compliance with gdpr um and with regard to children underage on the platform facebook could do substantially more to detect more of those children and they should have to publish for congress those processes because there are lots of subtleties in those things and they can be much more effective than probably what they're doing today got it now staying with these underage children since this hearing is all about kids and about online privacy i want you to tell
53:15
me how facebook is able to do market research on these children that are under age 13 because ms davis was really um she didn't deny this last week so how are they doing this do they uh bring kids into focus groups with their parents how do they get that permission she said they got permission from parents is there a permission slip
53:46
or a form that gets signed and then how do they know which kids to target um there's a bunch to unpack there uh we'll start with maybe how did they recruit children for focus groups or recruit teenagers most tech companies have systems where they can analyze the data that is on their servers so most of the focus groups i read or that i saw analysis of were around
54:17
messenger kids which has children on it and those focus groups appear to be children interacting in person um often large tech companies use either sourcing agencies that will go and identify people who meet certain demographic criteria or they will reach out directly based on uh data on the platform so for example on the case of messenger kids maybe you would want to study a child that was an active user and one that was a less active user you might reach out to some that came from each population and so these are
54:49
children that are under age 13. yeah and they know it um for some of these studies and i assume they get permission but i don't work on them okay well we're still waiting to get a copy of that parental consent form that would involve children um my time has expired mr chairman i'll save my other questions for our second round if we're able to get those thank you great thank you senator blackburn senator klobuchar thank you very much mr chairman thank you so much ms haugen for shedding light on how facebook
55:21
time and time again has put profit over people when their own research found that more than 13 percent of teen girls say that instagram made their thoughts of suicide worse what did they do they proposed instagram for kids which has now been put on pause because of public pressure when they found out that their algorithms are fostering polarization misinformation and hate that they allowed 99 percent of their violent content to remain unchecked on their platform including
55:53
lead up to the january 6 insurrection what did they do they now as we know mark zuckerberg is going sailing and saying no apologies i think the time has come for action and i think you are the catalyst for that action you have said privacy legislation is not enough i completely agree with you but i think you know we have not done anything to update our privacy laws in this country our federal privacy laws nothing zilch in any major way why because there
56:24
are lobbyists around every single corner of this building that have been hired by the tech industry we have done nothing when it comes to making the algorithms more transparent allowing for the university research that you refer to why because facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them we have done nothing significantly past although we are on a bipartisan basis working in the antitrust subcommittee to get something done on consolidation which you
56:54
understand allows the dominant platforms to control all this like the bullies in the neighborhood buy out the companies that maybe could have competed with them and added the bells and whistles so the time for action is now so i'll start i'll start with something that i asked facebook's head of safety when she testified before us last week i asked her how they estimate the lifetime value of a user for kids who start using their products before they turn 13. she evaded the question and said that's not the way
57:26
we think about it is that right or is it your experience that facebook estimates and puts a value on how much money they get from users in general we'll get to kids in a second is that a motivating force for them um based on what i saw in terms of allocation of integrity spending so one of the things disclosed in the wall street journal was that i believe it's like 87 percent of all the misinformation spending is spent on english but only about like nine percent of the users are
57:55
english speakers um it seems that facebook invests more in users who make them more money even though the danger may not be evenly distributed based on profitability does it make sense that having a younger person get hooked on social media at a young age makes them more profitable over the long term as they have a life ahead of them facebook's internal documents talk about the importance of getting younger users for example tweens onto instagram like instagram kids because they need to have um
58:27
like they know that children bring their parents online and things like that and so they understand the value of younger users for the long term success of facebook facebook reported advertising revenue to be $51.58 per user last quarter in the us and canada uh when i asked ms davis how much of that came from instagram users under 18 she wouldn't say do you think that teens are profitable for their company i would assume so based on advertising for things like television uh you have much
58:58
substantially higher advertising rates for customers who don't yet have preferences or habits and so i'm sure they're some of the more profitable users on facebook but i do not work directly on that another major issue that's come out of this uh eating disorders uh studies have found that eating disorders actually have the highest mortality rate of any mental illness for women and i led a bill on this with senators capito and baldwin that we passed into law and i'm concerned that these algorithms
59:27
that they have push outrageous content promoting anorexia and the like i know it's personal to you uh do you think that their algorithms push some of this content to young girls facebook knows that engagement based ranking the way that they pick the content in instagram for young users for all users amplifies preferences and they have done something called a proactive incident response where they take things they've heard for example like
59:58
can you be led by the algorithms to anorexia content and they have literally recreated that experiment themselves and confirmed yes this happens to people so facebook knows that they are leading young users to anorexia content do you think they are deliberately designing their product to be addictive beyond even that content uh facebook has a long history of having a successful and very effective growth division where they take little tiny tweaks and constantly are trying to optimize it to grow
01:00:30
those kinds of stickiness could be construed as things that facilitate addiction right last thing i'll ask is we've seen the same kind of content in the political world you brought up other countries and what's been happening there on 60 minutes you said that facebook implemented safeguards to reduce misinformation ahead of the 2020 election but turned off those safeguards right after the election and you know that the insurrection occurred january 6 do you think that facebook turned off the safeguards because they were costing the company
01:01:01
money because it was reducing profit facebook has been emphasizing a false choice they've said the safeguards that were in place before the election uh implicated free speech the choices that were happening on the platform were really about how reactive and twitchy was the platform right like how viral was the platform and facebook changed those safety defaults in the run-up to the election because they knew they were dangerous and because they wanted that growth back they wanted the acceleration of the platform back after the election they
01:01:34
they returned to their original defaults and the fact that they had to break the glass on january 6th and turn them back on i think that's deeply problematic agree thank you very much for your bravery and coming forward senator thune thank you mr chair and uh ranking member blackburn um i've been arguing for some time that it is time for congress to act and i think the question is always what is the correct way to do it the right way to do it consistent with
01:02:05
our first amendment right to free speech this committee doesn't have jurisdiction over the antitrust issue that's the judiciary committee and i'm not averse to looking at the monopolistic nature of facebook honestly i think that's a real issue that needs to be examined and perhaps addressed as well but at least under this committee's jurisdiction there are a couple of things i think we can do and i have a piece of legislation and senators blackburn and blumenthal are both co-sponsors called the filter bubble transparency act and essentially what it would do is
01:02:36
give users the options to engage with social media platforms without being manipulated by these secret formulas that essentially dictate the content that you see when you open up an app or log on to a website we also i think need to hold big tech accountable by reforming section 230 and one of the best opportunities i think to do that at least for in a bipartisan way is the platform accountability and consumer transparency or the pact act and that's legislation that i've co-sponsored with senator
01:03:07
schatz which in addition to stripping section 230 protections for content that a court determines to be illegal the pact act would also increase transparency and due process for users around the content moderation process and importantly in the context we're talking about today with this hearing with a major big tech whistleblower the pact act would explore the viability of a federal program for big tech employees to blow the whistle on wrongdoing inside the companies where they work in my view we should encourage employees
01:03:38
in the tech sector like you to speak up about uh questionable practices of big tech companies so we can among other things ensure that americans are fully aware of how social media platforms are using artificial intelligence and opaque algorithms to keep them hooked on the platform so let me ms haugen just ask you we've learned from the information that you've provided that facebook conducts what's called engagement based ranking which you've described as very dangerous could you talk more about why engagement-based ranking is dangerous
01:04:10
and do you think congress should seek to pass legislation like the filter bubble transparency act that would give users the ability to avoid engagement based ranking altogether facebook is going to say you don't want to give up engagement based ranking you're not going to like facebook as much if we're not picking out the content for you that's just not true facebook likes to present things as false choices like you have to choose between having lots of spam like let's say imagine we ordered our feeds by time
01:04:43
like on imessage or on um there are other forms of social media that are chronologically based they're going to say you're going to get spammed you're not going to enjoy your feed the reality is that those experiences have a lot of permutations there are ways that we can make those experiences where computers don't regulate what we see we together socially regulate what we see but they don't want us to have that conversation because facebook knows that when they pick out the content that we focus on using computers we
01:05:14
spend more time on their platform they make more money um the dangers of engagement-based ranking are that facebook knows that content that elicits an extreme reaction from you is more likely to get a click a comment or re-share and it's interesting because those clicks and comments and reshares aren't even necessarily for your benefit it's because they know that other people will produce more content if they get the likes and comments and reshares they prioritize content in your feed so you
01:05:45
will give little hints of dopamine to your friends so they will create more content and they have run experiments on people producer-side experiments where they have confirmed this so as part of the information you provided the wall street journal uh it's been found that facebook altered its algorithm in an attempt to boost these meaningful social interactions or msi but rather than strengthening bonds between family and friends on the platform the algorithm instead rewarded more outrage and sensationalism and i
01:06:16
think facebook would say that its algorithms are used to connect individuals with other friends and family that are largely positive do you believe that facebook's algorithms make its platform a better place for most users and should consumers have the option to use facebook and instagram without being manipulated by algorithms designed to keep them engaged on that platform i strongly believe like i've spent most of my career working on systems like engagement based ranking like when i come to you and say these things i'm
01:06:46
basically damning 10 years of my own work right um engagement based ranking facebook says we can do it safely because we have ai you know the artificial intelligence will find the bad content that we know our engagement-based ranking is promoting they've written blog posts on how they know engagement-based ranking is dangerous but the ai will save us facebook's own research says they cannot adequately identify dangerous content and as a result those dangerous algorithms that they admit are picking
01:07:16
up the extreme sentiments the division they can't protect us from the harms that they know exist in their own system and so i don't think it's just a question of saying should people have the option of choosing to not be manipulated by their algorithms i think if we had appropriate oversight or if we reformed 230 to make facebook responsible for the consequences of their intentional ranking decisions i think they would get rid of engagement based ranking because it is causing teenagers to be exposed to more anorexia
01:07:48
content it is pulling families apart and in places like ethiopia it's literally fanning ethnic violence i encourage reform of these platforms not picking and choosing individual ideas but instead making the platforms themselves safer less twitchy less reactive less viral because that's how we scalably solve these problems thank you mr chair i would uh simply say let's get to work so we got some things we can do here thanks i agree thank you uh senator schatz
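[Editor's illustration, not part of the testimony: the toy sketch below contrasts a chronological feed with the kind of engagement-based ranking the witness describes, where content predicted to provoke clicks, comments, and reshares is boosted. Every field, weight, and number here is invented for illustration; this is not Facebook's ranking system, only the general shape of the mechanism.]

```python
# Illustrative sketch only: chronological ordering vs engagement-weighted ordering.
posts = [
    {"id": "calm_update",  "timestamp": 300, "predicted_clicks": 2,  "predicted_comments": 1,  "predicted_reshares": 0},
    {"id": "outrage_bait", "timestamp": 100, "predicted_clicks": 40, "predicted_comments": 25, "predicted_reshares": 30},
    {"id": "family_photo", "timestamp": 200, "predicted_clicks": 5,  "predicted_comments": 4,  "predicted_reshares": 1},
]

def chronological_feed(posts):
    """Newest first: no model decides what you see."""
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

def engagement_ranked_feed(posts, w_click=1.0, w_comment=2.0, w_reshare=3.0):
    """Boost whatever is predicted to provoke reactions, regardless of why it provokes them."""
    def score(p):
        return (w_click * p["predicted_clicks"]
                + w_comment * p["predicted_comments"]
                + w_reshare * p["predicted_reshares"])
    return sorted(posts, key=score, reverse=True)

print([p["id"] for p in chronological_feed(posts)])      # ['calm_update', 'family_photo', 'outrage_bait']
print([p["id"] for p in engagement_ranked_feed(posts)])  # ['outrage_bait', 'family_photo', 'calm_update']
```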
01:08:20
thank you mr chairman ranking member thank you for your courage in coming forward was there a particular moment when you came to the conclusion that reform from the inside was impossible and that you decided to be a whistleblower there was a long series of moments where i became aware that facebook when faced with conflicts of interest between its own profits and the common good public safety consistently chose to prioritize its profits i think the
01:08:52
moment at which i realized we needed to get help from the outside that the only way these problems would be solved is by solving them together not solving them alone was when civic integrity was dissolved following the 2020 election it really felt like a betrayal of the promises that facebook had made to people who had sacrificed a great deal to keep the election safe by basically dissolving our community and integrating it into just other parts of the company and i know their response is that they've sort of distributed the duties
01:09:23
yeah that's an excuse right um i cannot see into the hearts of other men and i don't know let me say it this way it won't work right i can tell you that when i left the company the people who i worked with were disproportionately maybe 75 percent of my pod of seven people um so those are product managers program managers most of them had come from civic integrity all of us left the inauthentic behavior pod
01:09:54
either for other parts of the company or or the company entirely over the same six-week period of time so six months after the reorganization we had clearly lost faith that those changes were coming you said in your opening statement that they know how to make facebook and instagram safer so thought experiment you are now the chief executive officer and chairman of the company what changes would you immediately institute um i would immediately establish a policy of
01:10:26
how to share information and research from inside the company with appropriate oversight bodies like congress i would give proposed legislation to congress saying here's what an effective oversight agency would look like i would actively engage with academics to make sure that people who are confirming whether facebook's marketing messages are true have the information they need to confirm these things and i would um immediately implement the quote soft interventions that were
01:10:56
identified to protect the 2020 election so that's things like requiring someone to click on a link before resharing it because other companies like twitter have found that that significantly reduces misinformation no one is censored by being forced to click on a link before resharing it thank you i want to pivot back to instagram's targeting of kids we all know that they announced a pause but that reminds me of what they announced when they were gonna
01:11:26
issue a digital currency and they got beat up by the u.s senate banking committee and they said never mind and now they're coming back around hoping that nobody notices that they are going to try to issue a currency now let's set aside for the moment the sort of the the business model which appears to be gobble up everything do everything that's the gross growth strategy do you believe that they're actually going to discontinue instagram kids or they're just waiting for the dust to settle
01:11:57
um i would be sincerely surprised if they do not continue working on instagram kids and i would be amazed if a year from now we don't have this conversation again why facebook understands that if they want to continue to grow they have to find new users they have to make sure that that the next generation is just as engaged with instagram as the current one and the way they'll do that is by making sure that children establish habits before they have good self-regulation by hooking kids by
01:12:28
hooking kids i would like to emphasize one of the documents that we sent in on problematic use examined the rates of problematic use by age and that peaked with 14 year olds it's just like cigarettes teenagers don't have good self-regulation they say explicitly i feel bad when i use instagram and yet i can't stop we need to protect the kids just my final question i have a long list of misstatements misdirections and outright lies from the company i
01:13:01
don't have time to read them but you're as intimate with all these deceptions as i am so i will just jump to the end uh if you were a member of this panel would you believe what facebook is saying i would not facebook has not earned our right to just have blind trust in them last week one of the most beautiful things that i heard in the committee was um trust is earned and facebook has not
01:13:32
earned our trust thank you thanks senator schatz uh senator moran uh and then uh we've been joined by the chair senator cantwell she'll be next we're going to break at about 11 30 if that's okay because we have a vote um and then we'll reconvene okay mr chairman thank you the conversation so far reminds me that you and i ought to resolve our differences and introduce legislation so as senator thune said let's go to work our
01:14:05
differences are very minor or they seem very minor in the face of the revelations that we've now seen so i'm hoping we can move forward senator moran i share that view mr chairman thank you uh thank you very much uh for your testimony what examples do you know we've talked about uh particularly children teenage girls specifically but what other examples do you know about where facebook or instagram knew its decisions would be harmful uh
01:14:37
to its users but still proceeded with the plan and executed that harmful behavior facebook's internal research is aware that there are a variety of problems facing children on instagram they know that severe harm is happening to children for example in the case of bullying uh facebook knows that instagram dramatically changes the experience of high school so when we were in high
01:15:09
school when i was in high school most kids have positive home lives like it doesn't matter how bad it is at school kids can go home and reset for 16 hours kids who are bullied on instagram the bullying follows them home it follows them into their bedrooms the last thing they see before they go to bed at night is someone being cruel to them or the first thing they see in the morning is someone being cruel to them
01:15:40
kids are learning that their own friends like people who they care about them are cruel to them like think about how that's going to impact their domestic relationships when they become 20 somethings or 30-somethings to believe that people who care about you are mean to you facebook knows that parents today because they didn't experience these things they never experienced this addictive experience with a piece of technology they give their children bad advice they say things like why don't you just stop using it and so that facebook's own research is aware that children express feelings of
01:16:11
loneliness and struggling with these things because they can't even get support from their own parents i don't understand how facebook can know all these things and not escalate it to someone like congress for help and support in navigating these problems let me ask the question in a broader way besides teenagers or besides girls or besides youth are there other practices at facebook or instagram that are known to be harmful but yet are pursued um uh facebook is aware that choices are
01:16:45
made in establishing like meaningful social meaningful social interactions so engagement based ranking that didn't care if you bullied someone or committed hate speech in the comments that was meaningful they know that that change directly changed publishers behavior that companies like buzzfeed wrote in and said the content is most successful on our platform is some of the content we're most ashamed of you have a problem with your ranking and they did nothing they know that uh politicians are being forced to take positions they know their own constituents don't like or approve
01:17:16
of because those are the ones that get distributed on facebook that's a huge huge negative impact they all facebook also knows that they have admitted in public that engagement based ranking is dangerous without integrity and security systems but then not rolled out those integrity and security systems to most of the languages in the world and that's what causing things like ethnic violence in ethiopia thank you for your answer what is the magnitude of facebook's revenues or profits that come from
01:17:47
the sale of user data oh i'm i'm sorry i've never worked on that i'm not aware thank you um what regulations or legal actions by congress or by administrative action do you think would have the most consequence or would be feared most by facebook instagram or allied companies um i strongly encourage reforming section 230 to exempt decisions about algorithms right so modifying 230 around content i think has
01:18:18
uh it's very complicated because uh user generated content is something that companies have less control over they have a hundred percent control over their algorithms and facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety they shouldn't get a free pass on that because they're paying for their profits right now with our safety so i strongly encourage reform of 230 in that way i also believe there needs to be a dedicated oversight body because right
01:18:50
now the only people in the world who are trained to analyze these experiments to understand what's happening inside of facebook are people who you know grew up inside of facebook or pinterest or another social media company and there needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this and and have a place to work on things like regulation to bring that information out to the oversight boards that that have the right to do oversight a regulatory agency within the federal government yes
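Haugen's answers in this exchange describe engagement-based ranking, the meaningful social interactions metric, and soft interventions in words; the short sketch below is only a hypothetical illustration of that idea, not Facebook's actual system. The Post fields, the weights, and the two scoring functions are assumptions chosen to show why a scorer that rewards predicted comments and downstream reshares regardless of hostility will boost exactly the content she describes, and how a soft-intervention style down-weight changes which post wins.

# A toy illustration of engagement-based ranking ("meaningful social interactions").
# All field names and weights here are hypothetical; Facebook's real MSI formula
# and its soft interventions are not public.
from dataclasses import dataclass

@dataclass
class Post:
    author_is_friend: bool
    predicted_comments: float   # model's estimate of comments the post will draw
    predicted_reshares: float   # model's estimate of downstream reshares
    predicted_reactions: float  # likes, angry reactions, etc.
    flagged_hostile: bool       # an integrity classifier thinks the thread is bullying/hate

def msi_score(post: Post) -> float:
    # Engagement-only scoring: a hostile comment counts the same as a kind one,
    # and downstream reshares are rewarded, which favors provocative content.
    score = (2.0 * post.predicted_comments
             + 1.5 * post.predicted_reshares   # "downstream MSI" style weight
             + 1.0 * post.predicted_reactions)
    return score * (1.5 if post.author_is_friend else 1.0)

def safety_weighted_score(post: Post) -> float:
    # One possible "soft intervention": keep the content up but stop boosting it
    # when integrity signals fire.
    return msi_score(post) * (0.2 if post.flagged_hostile else 1.0)

feed = [
    Post(True, predicted_comments=40, predicted_reshares=30, predicted_reactions=50, flagged_hostile=True),
    Post(True, predicted_comments=10, predicted_reshares=2, predicted_reactions=60, flagged_hostile=False),
]
print(sorted(feed, key=msi_score, reverse=True)[0].flagged_hostile)             # True: the hostile thread ranks first
print(sorted(feed, key=safety_weighted_score, reverse=True)[0].flagged_hostile) # False: it no longer wins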
01:19:20
thank you very much thank you chairman senator cantwell thank you mr chairman thank you for holding this hearing and i think my colleagues have brought up a lot of important issues and so i think i just want to continue on that vein um first of all the privacy act that i introduced along with several of my colleagues actually does have ftc oversight of algorithm transparency in some instances i'd hope you take a look at that and tell us what other areas you think we should add to that level of transparency
01:19:50
but um clearly that's the the issue at hand here i think in your coming forward so thank you again for your willingness to do that um the documentation that you say now exists is uh the level of transparency about what's going on that people haven't been able to see and so your information that you say has gone up to the highest levels at facebook is that they purposely knew that their algorithms were continuing to have misinformation and hate information
01:20:23
and that when presented with information about this terminology you know downstream msi meaningful social interaction knowing that it was this choice you could continue this wrong-headed information hate information about the rohingya or you could continue to get higher click-through rates and i know you said you don't know about profits but i'm pretty sure you know that on a page if you click through to that next page i'm pretty sure there's a lot more ad revenue than if you didn't click through
01:20:53
so you're saying that documents exist that at the highest level at facebook you had information discussing these two choices and that people chose even though they knew that it was misinformation and hurtful and maybe even costing people their lives they continued to choose profit we have submitted documents to congress outlining that mark zuckerberg was directly presented with a list of quote soft interventions so a hard intervention is like taking a piece of content off facebook taking a user off
01:21:23
facebook soft interventions are about making slightly different choices to make the platform less viral less twitchy mark was presented with these options and chose to not remove downstream msi in april of 2020 even isolated to just at-risk countries that's countries at risk of violence if it had any impact on the overall msi metric so he chose not to remove it because in translation that would mean less money yeah was
01:21:56
there another reason given why they would do it other than they thought it would really affect their numbers i don't know for certain jeff horowitz the reporter for the wall street journal and i struggled with this we sat there and read these minutes and we're like how is this possible like we've just read 100 pages on how downstream msi expands hate speech misinformation violence inciting content graphic violent content why won't you get rid of this and the best theory that we've come up with and i want to emphasize this is just our
01:22:26
interpretation on it is people's bonuses are tied to msi right like people stay or leave the company based on what they get paid and like if you hurt msi a bunch of people weren't going to get their bonuses so you're saying that this practice even still continues today like we're still in this environment i'm personally very frustrated by this because we presented information to facebook from one of my own constituents in 2018 talking about this issue with the rohingya
01:22:57
pleading with the company we pleaded with the company and they continue to not address this issue now you're pointing out that these same algorithms are being used and they know darn well in ethiopia that it's causing and inciting violence and again they are still today choosing profit over taking this information down is that correct when rioting began in the united states in the summer of last year they turned off downstream msi only when they detected content was health content which is probably covid and civic
01:23:28
content but facebook's own algorithms are bad at finding this content it's still in the raw form for 80 to 90 percent of even that sensitive content in countries where they don't have integrity systems in the local language and in the case of ethiopia there are 100 million people in ethiopia and six languages facebook only supports two of those languages for integrity systems this strategy of focusing on language specific content specific systems and ai to save us is doomed to fail i need to
01:23:59
get to one of them first of all i'm sending a letter to facebook today they better not delete any information as it relates to the rohingya our investigations about how they proceeded on this particularly on in light of your information or the documents but aren't we also now talking about advertising fraud aren't you selling something to advertisers that's not really what they're getting we know about this because of the newspaper issues we're trying to say that journalism that basically has to meet a different standard a public interest standard that basically is out there basically proving
01:24:30
every day or they can be sued these guys are a social media platform that doesn't have to live with that and then the consequences they're telling their advertisers that this was the price we see it we see it people are coming back to the local journalism because they're like we want to be again with the trusted brand we don't want to be in you know your website so i think your filing with the sec is an interesting one but i think that we also have to look at what are the other issues here and one of them is did they defraud advertisers in telling them this was the advertising content that you were going
01:25:01
to be advertising when in reality it was something different it was based on a different model we have multiple examples of questions and answers for the advertising staff the sales staff where advertisers asked after the riots last summer should we come back to facebook or after the insurrection should we come back to facebook and facebook said in the talking points that they gave to advertisers we're doing everything in our power to make this safer or we take down all the hate speech when we find it but facebook's own research shows that was not true they get three to
01:25:32
five percent of hate speech thank you thank you mr chairman thanks senator cantwell um and if you want to make your letter available to other members of the committee i'd be glad to join you myself thank you thank you for suggesting it thank you um senator lee thank you mr chairman and thank you ms haugen for joining us this week it's very very helpful we're grateful that you're willing to make yourself available last week we had another witness from
01:26:03
facebook um ms davis she came and she testified before this committee and she focused on among other things the extent to which facebook targets ads to children including ads that are either sexually suggestive or geared toward adult themed products or themes in general now i didn't i well i appreciated her willingness to be here i didn't get the clearest answers in response to some of those questions and so i'm hoping that you can help shed some light
01:26:34
on some of those issues related to facebook's advertising processes here today as we get into this i want to first read you a a quote that i got from from ms davis last week um here's what she said during her questioning when we do ads to young people there are only three things that an advertiser can target around age gender location we also prohibit certain ads to young people including weight loss ads we don't allow tobacco ads at all
01:27:06
them to young people we don't allow them to children we don't allow them to minors close quote now since that exchange happened last week there are a number of individuals and groups including a group called the technology transparency project or ttp that have indicated that that part of her testimony was inaccurate that it was false ttp noted that ttp had conducted an experiment just last month
01:27:36
and their goal was to run a series of ads that would be targeted to children ages 13 to 17 to users in the united states now i want to emphasize that ttp didn't end up running these ads they stopped them from being distributed to users but facebook did in fact approve them and as i understand it facebook approved them for an audience of up to 9.1 million users all of whom were teens so
01:28:08
i brought a few of these to show you today this is this is the first one i wanted to showcase this first one has a colorful graphic uh encouraging kids to quote throw a skittles party like no other um which you know as the graphic indicates and as as the slang jargon also independently suggests this involves kids getting together randomly to abuse prescription drugs the second graphic displays
01:28:39
an anna tip that is a tip specifically designed to encourage and promote anorexia and it's on there now the language the ana tip itself independently promotes that the ad also promotes it insofar as it was suggesting these are images you ought to look at when you need motivation to be more anorexic i guess you could say now the third one invites children to find their partner online and to make a
01:29:11
love connection you look lonely find your partner now to make a love connection now look it'd be an entirely different kettle of fish if this were targeted to an adult audience it is not it's targeted to 13 to 17 year olds now obviously i don't support and ttp does not support these messages particularly when targeted to impressionable children and again just to be clear ttp did not end up pushing the ads out after receiving facebook's approval
01:29:42
but it did in fact receive facebook's approval um so i i think this says something one could argue that it proves that facebook is allowing and and perhaps facilitating the targeting of harmful adult-themed ads to our nation's children so could you please explain to me uh ms haugen um how these ads uh with a target audience of 13 to 17 year old children
01:30:14
how would they possibly be approved by facebook and is ai involved in that um i did not work directly on the ad approval system um uh what was resonant for me about your testimony is facebook has a deep focus on scale so scale is can we do things very cheaply for a huge number of people which is part of why they rely on ai so much it is very possible that none of those ads were seen by a human and the reality is that we've seen from repeated
01:30:46
documents within my disclosures is that facebook's ai systems only catch a very tiny minority of offending content and best case scenario in the case of something like hate speech at most they will ever get 10 to 20 percent in the case of children that means drug paraphernalia ads like that it's likely if they rely on computers and not humans they will also likely never get more than 10 to 20 percent of those ads understood mr chairman i've got one minor follow-up question it should be easy to answer go ahead
01:31:18
um so well facebook may claim that it only targets ads based on age gender and location even though these things uh seem to counteract that but let's set that aside for a minute um and that they're not basing ads uh based on specific interest categories does facebook still collect interest category data on teenagers even if they aren't at that moment
01:31:48
targeting ads at teens based on those interest categories i think it's very important to differentiate between what targeting advertisers are allowed to specify and what targeting facebook may learn for an ad let's imagine you had some text on an ad it would likely extract out features that it thought were relevant for that ad for example in the case of something about partying it would learn partying is a concept i would be very suspicious of a claim that personalized ads are not being delivered
01:32:20
to teenagers on instagram because the algorithms learn correlations they learn interactions where your party ad may still go to kids interested in partying because facebook almost certainly has a ranking model in the background that says this person wants more party related content interesting thank you that's very helpful and what that suggests to me is that while they're saying they're not targeting teens with those ads the algorithm might do some of that work for them which
01:32:51
might explain why they collect the data even while claiming that they're not targeting those ads in that way i can't speak to whether or not that's the intention but the reality is it's very very difficult to understand these algorithms today and over and over again we saw these biases that the algorithms unintentionally learn and so yeah it's very hard to disentangle out these factors as long as you have engagement based ranking thank you ms haugen thank you very much senator lee senator markey
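Haugen's point in the exchange above is that even when an advertiser can only target by age, gender, and location, a delivery model that learns interest correlations from engagement can still steer a party-themed ad toward teens who engage with partying content. The snippet below is a hypothetical sketch of that dynamic; the keyword table, the affinity counts, and the delivery_score formula are illustrative assumptions, not Facebook's ad-delivery system.

# A hypothetical sketch of interest inference in ad delivery: the advertiser only
# targets age/gender/location, but the delivery model has learned which users
# engage with a concept (here "partying") and steers the ad to them anyway.
# Keyword table, affinity counts, and the scoring rule are illustrative assumptions.
from collections import defaultdict

CONCEPT_KEYWORDS = {"party": "partying", "skittles": "partying", "thinspo": "eating_disorder"}

def extract_concepts(ad_text: str) -> set:
    # Crude stand-in for the text feature extraction Haugen describes.
    return {CONCEPT_KEYWORDS[w] for w in ad_text.lower().split() if w in CONCEPT_KEYWORDS}

# Engagement history per user: concept -> how often they clicked or lingered on it.
user_affinity = {
    "teen_a": defaultdict(int, {"partying": 12}),
    "teen_b": defaultdict(int, {"partying": 0}),
}

def delivery_score(user: str, ad_text: str, base_bid: float) -> float:
    # The learned affinity, not the advertiser's targeting, decides who sees the ad first.
    affinity = sum(user_affinity[user][c] for c in extract_concepts(ad_text))
    return base_bid * (1.0 + 0.1 * affinity)

ad = "throw a skittles party like no other"
print(delivery_score("teen_a", ad, base_bid=1.0))  # 2.2 -> this teen is shown the ad first
print(delivery_score("teen_b", ad, base_bid=1.0))  # 1.0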
01:33:22
thank you mr chairman very much thank you ms haugen you are a 21st century american hero warning our country of the danger for young people and for our democracy and our nation owes you a huge debt of gratitude for the courage you're showing here today so thank you ms haugen do you agree that facebook actively seeks to attract children and
01:33:53
teens onto its platforms facebook actively markets to children under the age of 18 to get on instagram and definitely targets children as young as eight to be on messenger kids an internal facebook document from 2020 that you revealed reads why do we care about tweens they are a valuable but untapped audience so facebook only cares about children to the extent that they are of monetary value
01:34:25
last week facebook's global head of safety antigone davis told me that facebook does not allow targeting of certain harmful content to teens ms davis stated we don't allow weight loss ads to be shown to people under the age of 18 yet a recent study found that facebook permitted targeting of teens as young as 13 with ads that showed a young woman's thin waist promoting websites that glorify anorexia ms haugen
01:34:56
based on your time at facebook do you think facebook is telling the truth i think facebook uh has focused on scale over safety and it is likely that they are using artificial intelligence to try to identify harmful ads without allowing the public oversight to see what is the actual effectiveness of those safety systems you unearthed facebook's research about its harm to teens did you raise this issue with your supervisors um i did not work directly on
01:35:28
anything involving teen mental health this research was freely available to anyone in the company ms davis testified last week quote we don't allow tobacco ads at all we don't allow them to children either we don't allow alcohol ads to minors however researchers also found that facebook does allow targeting of teens with ads on vaping ms haugen based on your time at facebook do you think facebook is telling the truth i do not have context on that
01:35:59
issue i assume that if they are using artificial intelligence to catch those vape ads unquestionably ads are making their way through okay so from my perspective listening to you and your incredibly courageous revelations time and time again facebook says one thing and does another time and time again facebook fails to abide by the commitments that they had made time and time again facebook lies about what they are doing yesterday
01:36:30
facebook had a platform outage but for years it has had a principles outage its only real principle is profit facebook's platforms are not safe for young people as you said facebook is like big tobacco enticing young kids with that first cigarette that first social media account designed to hook kids as users for life ms haugen your whistleblowing shows that facebook uses harmful features that quantify popularity push manipulative
01:37:01
influencer marketing and amplify harmful content to teens and last week in this committee facebook wouldn't even commit to not using these features on 10-year-olds facebook is built on computer codes of misconduct senator blumenthal and i have introduced the kids internet design and safety act the kids act you have asked us to act as a committee
01:37:33
and facebook has scores of lobbyists in the city right now coming in right after this hearing to tell us we can't act and they have been successful for a decade in blocking this committee from acting so let me ask you a question the kids internet design and safety act or the kids act here's what the legislation does it includes outright bans on children's app features that one quantify popularity with likes and follower
01:38:05
counts two that promote influencer marketing and three that amplify toxic posts and it would prohibit facebook from using its algorithms to promote toxic posts should we pass that legislation i strongly encourage reforms that push us towards human scale social media and not computer driven social media those amplification harms are caused by
01:38:36
computers choosing what's important to us not our friends and family and i encourage any system that children are exposed to to not use amplification systems so you agree that congress has to enact these special protections for children and teens that stop social media companies from manipulating young users and threatening their well-being and to stop using its algorithms to harm kids you agree with that i do believe congress must act to protect children and children and teens also need a
01:39:06
privacy online bill of rights i'm the author of the children's online privacy protection act of 1998 but it only covers kids under 13 because the industry stopped me from making it age 16 in 1998 because it was already their business model but we need to update that law for the 21st century tell me if this should pass one create an online eraser button so that young users can tell websites to delete the data they've collected about them
01:39:37
two give young teens under the age of 16 and their parents control of their information and three ban targeted ads to children i support all those actions thank you and finally i've also introduced the algorithmic justice and online platform transparency act which would one open the hood on facebook and big tech's algorithms so we know how facebook
01:40:10
is using our data to decide what content we see and two ban discriminatory algorithms that harm vulnerable populations online like showing employment and housing ads to white people but not to black people in our country should congress pass that bill algorithmic bias issues are a major issue for our democracy during my time at pinterest i became very
01:40:40
aware of the challenges like i mentioned before it's difficult for us to understand how these algorithms actually act and perform facebook is aware of complaints today by people like african americans saying that reels doesn't give african americans the same distribution as white people and until we have transparency and the ability to confirm ourselves that facebook's marketing messages are true we will not have a system that is compatible with democracy so i thank senator lee i agree with you
01:41:12
uh in your line of questions i wrote facebook asking them to explain that discrepancy because facebook i think is lying about targeting 13 to 15 year olds so here's my message for mark zuckerberg your time of invading our privacy promoting toxic content and preying on children and teens is over congress will be taking action you can work with us or not work with us but we will not
01:41:44
allow your company to harm our children and our families and our democracy any longer thank you ms haugen we will act thanks senator markey we're going to turn to senator blackburn and then we will take a break i know that there's some interest in another round of questions maybe we'll turn to senator lujan
01:42:18
for his questions cruz and scott and we have others so we'll come back after that mr chairman i have to go to sit in the chair starting at noon today i do have one question this relates to what senator markey was asking does facebook ever employ child psychologists or mental health
01:42:48
professionals to deal with these children online issues that we're discussing facebook has many researchers with phds i assume some of them are uh i know that some have psychology degrees i'm not sure if they are child specialists facebook also works with external agencies that are specialists at children's rights online senator lujan and then at the conclusion of senator lujan's questions we'll take a break we'll come back
01:43:20
at noon thank you mr chairman and i appreciate the indulgence of the committee ms haugen last week the committee heard directly from ms davis the global head of safety for facebook during the hearing the company contested their own internal research as if it does not exist yes or no does facebook have internal research indicating that instagram harms teens particularly harming perceptions of body
01:43:52
image which disproportionately affects young women yes facebook has extensive research on the impacts of its products on teenagers including young women thank you for confirming these reports last week i requested facebook make the basis of this research the data set minus any personally identifiable information available to this committee do you believe it is important for transparency and safety that facebook release the basis of this internal research the core data set to allow for
01:44:22
independent analysis i believe it is vitally important for our democracy that we establish mechanisms where facebook's internal research must be disclosed to the public on a regular basis and that we need to have privacy sensitive data sets that allow independent researchers to confirm whether or not facebook's marketing messages are actually true beyond this particular research should facebook make its internal primary research not just secondary slide decks of cherry pick data but the underlying data public by default can this be done
01:44:54
in a way that respects user privacy i believe in collaboration with academics and other researchers that we can develop privacy conscious ways of exposing radically more data than is available today it is important for our ability to understand how algorithms work and how facebook shapes the information we get to see that we have these data sets be publicly available for scrutiny is facebook capable of making the right decision here on its own or is regulation needed to create real transparency at facebook until incentives change at facebook we should
01:45:24
not expect facebook to change we need action from congress last week i asked miss davis about shadow profiles for children on the site and she answered that no data is ever collected on children under 13 because they are not allowed to make accounts this tactfully ignores the issue facebook knows children use their platform however instead of seeing this as a problem to be solved facebook views this as a business opportunity yes or no does facebook conduct research on children under 13 examining the
01:45:55
business opportunities of connecting these young children to facebook's products i want to emphasize how vital it is that facebook should have to publish the mechanisms by which it tries to detect these children because they are on the platform in far greater numbers than anyone is aware i am aware that facebook is doing research on children under the age of 13 and those studies are included in my disclosure you have shared your concerns about how senior management at facebook has continuously prioritized revenue over
01:46:25
potential user harm and safety and i have a few questions on facebook's decision making last week i asked miss davis quote has facebook ever found a change to its platform would potentially inflict harm on users but facebook moved forward because the change would also grow users or increase revenue ms davis said in response quote it's not been my experience at all at facebook that's just not how we would approach it yes or
01:46:55
no has facebook ever found a feature on its platform harmed its users but the feature moved forward because it would also grow users or increase revenue facebook likes to paint that these issues are really complicated there are lots of simple issues for example requiring someone to click through on a link before you reshare it that's not a large imposition but it does decrease growth a tiny little amount because in some countries reshares make up 35 percent of all the content that people see facebook prioritized that content on the
01:47:27
system the reshares over the impacts to misinformation hate speech or violence incitement did these decisions ever come from mark zuckerberg directly or from other senior management at facebook we have a few choice documents that contain notes from briefings with mark zuckerberg where he chose metrics defined by facebook like meaningful social interactions over changes that would have significantly decreased misinformation hate speech and other inciting content and this is the reference you shared earlier to miss
01:47:58
cantwell april of 2020 the soft interventions facebook appeared to be able to count on the silence of its workforce for a long time even as it knowingly continued practices and policies that continue to cause and amplify harm facebook content moderators have called out a culture of fear and secrecy within the company that prevented them from speaking out is there a culture of fear at facebook around whistleblowing and external accountability uh facebook has a culture that that that
01:48:30
um emphasizes that that insularity is the path forward that if information is shared with the public it will just be misunderstood and i believe that relationship has to change the only way that we will solve these problems is by solving them together and we'll have much better more democratic solutions if we do it collaboratively than in isolation and my final question is there a senior level executive at facebook like an inspector general who's responsible for ensuring complaints from facebook employees are taken seriously and that employees legal
01:49:02
ethical and moral concerns receive consideration with the real possibility of instigating change to company policies um i'm not aware of that role but the company is large and it may exist i appreciate that it's my understanding that there's a gentleman uh by the name of roy austin who is the vice president of civil rights who's described himself as an inspector general but he does not have the authority to make these internal conflicts public the oversight board was created by facebook to review
01:49:33
moderation policies related to public content specifically it was not created to allow employees to raise concerns so again another area of interest i believe that we have to act on i thank you for coming forward today my pleasure happy to serve the committee is in recess so the senate commerce science and transportation subcommittee on consumer protection product safety and data security has gone into recess to allow members to vote on the senate
01:51:09
floor they've been hearing testimony today from francis haugen a facebook whistleblower on her experiences at the social media company and how to better protect children online with privacy regulations and laws we'll return to live coverage when they resume here on c-span so while we wait for the committee to resume we'll show you a portion of the hearing from earlier today
01:57:00
documented proof that facebook knows its products can be addictive and toxic to children and it's not just that they made money again it's that they valued their profit more than the pain that they caused to children and their families the damage to self-interest and self-worth inflicted by facebook today will haunt a generation feelings of inadequacy and insecurity
01:57:34
rejection and self-hatred will impact this generation for years to come our children are the ones who are victims teens today looking at themselves in the mirror feel doubt and insecurity mark zuckerberg ought to be looking at himself in the mirror today and yet rather than taking responsibility and showing leadership
01:58:07
mr zuckerberg is going sailing his new modus operandi is no apologies no admission no action nothing to see here mark zuckerberg you need to come before this committee you need to explain to francis haugen to us to the world and to the parents of america what you were doing and why you did it instagram's business model is pretty straightforward more eyeballs more
01:58:40
dollars everything facebook does is to add more users and keep them on their apps for longer in order to hook us instagram uses our private information to precisely target us with content and recommendations assessing that what will provoke a reaction will keep us scrolling far too often these recommendations encourage our most destructive and dangerous
01:59:11
behaviors as we showed on thursday we created a fake account my office and i did as a teen interested in extreme dieting and eating disorders instagram latched on to that teenager's initial insecurities that then push more content and recommendations glorifying eating disorders that's how instagram's algorithms can push teens into darker and darker
01:59:43
places facebook's own researchers called it instagram's quote perfect storm exacerbating downward spirals facebook as you have put it so powerfully maximizes profits and ignores pain facebook's failure to acknowledge and to act makes it morally bankrupt again and again facebook rejected reforms recommended by its own researchers
02:00:15
last week ms davis said quote we're looking at end quote no specific plans no commitments only vague platitude these documents that you have revealed provided this company with a blueprint for reform provided specific recommendations that could have made facebook and instagram safer the company repeatedly ignored those recommendations from its own
02:00:47
researchers that would have made facebook and instagram safer facebook researchers have suggested changing their recommendations to stop promoting accounts known to encourage dangerous body comparison instead of making meaningful changes facebook simply pays lip service and if they won't act
02:01:17
and if big tech won't act congress has to intervene privacy protection is long overdue senator markey and i have introduced the kids act which would ban addictive tactics that facebook uses to exploit children parents deserve better tools to protect their children i'm also a firm supporter of reforming section 230. we should consider narrowing this sweeping immunity when platforms algorithms
02:01:48
amplify illegal conduct you've commented on this in your testimony and perhaps you'll expand on it we have also heard compelling recommendations about requiring disclosures of research and independent reviews of these platforms algorithms and i plan to pursue these ideas the securities and exchange commission should investigate your contentions and claims ms haugen and so should the federal trade commission facebook appears to have
02:02:20
misled the public and investors and if that's correct it ought to face real penalties as a result of that misleading and deceptive misrepresentation i want to thank all my colleagues who are here today because what we have is a bipartisan congressional roadmap for reform that will safeguard and protect children from big tech that will be a focus of our subcommittee moving forward and it will continue
02:02:55
continue to be bipartisan and finally i'll just end on this note in the past weeks and days parents have contacted me with their stories heartbreaking and spine chilling stories about children pushed into eating disorders bullying online self-injury of the most disturbing kind and sometimes even taking their lives
02:03:26
because of social media parents are holding facebook accountable because of your bravery ms haugen and we need to hold accountable facebook and all big tech as well again my thanks to you i am going to enter into the record a letter from 52 state attorneys general and from two members of the youth advisory board of sandy hook promise as long as there's no objection and i will now turn to the ranking
02:03:57
member senator blackburn thank you mr chairman and thank you for entering that letter in the record that we have from our states attorneys general good morning to everyone it is nice to see people in this hearing room and to be here for the hearing today ms haugen we thank you for your appearance before us today and for giving the opportunity not only for congress but for the american people to hear from you in this setting and we
02:04:30
appreciate that mr chairman i think also thanks to you and your staff uh that have worked with our team to make certain that we had this hearing in this opportunity uh today so that we can get more insight into what facebook is actually doing as they invade the privacy not only of adults but of children and look at the ways that they are in violation of the children's online privacy protection act
02:05:02
which is federal law and looking at how they are evading that law and working around it and as the chairman said passing a federal online privacy standard has been long in the works i filed my first privacy bill when i was in the house back in 2012 and i think that it will be this congress and this subcommittee that is
02:05:33
going to lead the way to online privacy data security section 230 reforms and of course senator klobuchar always wants to talk about antitrust and i have to give a nod senator markey is down there when we were in the house we were probably two of the only ones who were talking about the need to have a federal privacy standard now as the chairman mentioned last week
02:06:02
we heard from ms davis who heads global safety for facebook and it was surprising to us that what she tried to do was to minimize the information that was in these documents to minimize the research and to minimize the knowledge that facebook had at one point i even reminded her the research was not third-party research
02:06:32
the research was their own it was facebook's internal research so they knew what they were doing they knew where the violations were and they know they are guilty they know this their research tells them this last week in advance of our hearing facebook released two studies and said
02:07:04
that the wall street journal was all wrong they had just gotten it wrong as if the wall street journal did not know how to read these documents and how to work through this research having seen the data that you've presented and the other studies that facebook did not publicly share i feel pretty confident that it's facebook who has done the misrepresenting
02:07:35
to this committee here are some of the numbers that facebook chose not to share and mr chairman i think it's important that we look at these as we talk about the setting for this hearing what we learned last week what you and i have been learning over the past three years about big tech and facebook and here you go 66 percent of teen girls on instagram and 40 percent of teen boys experience negative social comparisons
02:08:07
this is facebook's research 52 percent of teen girls who experienced negative social comparison on instagram said it was caused by images related to beauty social comparison is worse on instagram because it is perceived as real life but based on celebrity standards social comparison mimics the grief cycle and includes a downward emotional spiral encompassing a range of emotions from
02:08:40
jealousy to self-proclaimed body dysmorphia facebook addiction which facebook conveniently calls problematic use is most severe in teens peaking at age 14. here's what else we know facebook is not interested in making significant changes to improve kids safety on their platforms at least not when that would result in losing
02:09:11
eyeballs on posts or decreasing their ad revenues in fact facebook is running scared as they know that in their own words young adults are less active and less engaged on facebook and that they are running out of teens to add to instagram so teens are looking at other platforms like tiktok and facebook is only making those changes that add to its
02:09:39
user numbers and ultimately its profits follow the money so what are these changes allowing users to create multiple accounts that facebook does not delete and encouraging teens to create second accounts they can hide from their parents they are also studying younger and younger children as young as eight so that they can market to them and while ms davis says that kids below 13
02:10:12
are not allowed on facebook or instagram we know that they are because she told us that they recently had deleted 600 000 accounts from children under age 13. so how do you get that many underage accounts if you aren't turning a blind eye to them in the first place and then in order to try to clean it up you go to delete it and then you say oh by the way we just
02:10:42
in the last month deleted 600 000 underage accounts and speaking of turning a blind eye facebook turns a blind eye to user privacy news broke yesterday the the private data of over 1.5 billion that's right 1.5 billion facebook users is being sold on a hacking forum that's its biggest data breach to date
02:11:14
examples like this underscore my strong concerns about facebook collecting the data of kids and teens and what they are doing with it facebook also turns a blind eye toward blatant human exploitation taking place on its platform trafficking forced labor cartels the worst possible things one can imagine big tech companies have gotten away with abusing consumers for too long it is clear that facebook prioritizes profit
02:11:47
over the well-being of children and all users so as a mother and a grandmother this is an issue that is of particular concern to me so we thank you for being here today ms haugen and we look forward to getting to the truth about what facebook is doing with users data and how they are abusing their privacy and how they show a lack of
02:12:17
respect for the individuals that are on their network we look forward to the testimony thank you mr chairman thanks senator blackburn i don't know whether the ranking member would like to make a statement if you don't mind thank you chairman blumenthal and i will just take a moment or two and i do appreciate being able to speak as ranking member of the full committee ms haugen this is a subcommittee hearing you see some
02:12:49
vacant seats uh there's pretty good attendance for a subcommittee uh there are also a lot of things going on so people will be coming and going but i'm i'm willing to predict that this will have almost 100 percent attendance by members of the subcommittee because of the importance of this subject matter so thanks for coming forward to share concerns about facebook's business practices particularly with respect to children and teens and of course that is the the main topic of our it's the title
02:13:21
of our hearing today protecting kids thank you for your patience we're going to reconvene and we'll go to senator hickenlooper thank you mr chair thank you ms haugen for your direct answers and for being willing to come out and provide such clarity on so many of these issues obviously facebook can manipulate its algorithms to attract users and i guess my question would be do you feel
02:13:57
in your humble opinion that simply maximizing profits no matter the societal impact is justified that's the short question which i think i know the answer to what impact on facebook's bottom line would it have if the algorithm was changed to promote safety and to save the lives of young women rather
02:14:28
than putting them at risk facebook today makes approximately 40 billion dollars a year in profit a lot of the changes that i'm talking about are not going to make facebook an unprofitable company it just won't be a ludicrously profitable company like it is today engagement based ranking which causes those amplification problems that lead young women from innocuous topics like healthy recipes to anorexia
02:15:04
content if it were removed people would consume less content on facebook but facebook would still be profitable and so i encourage oversight and public scrutiny into how these algorithms work and the consequences of them right well and i appreciate that i'm a former small business owner i started a brew pub back in 1988 and we worked very
02:15:34
hard to to look again we weren't doing investigations but we were very sensitive to whether someone had too much to drink whether we had a frequent customer who was frequently putting himself at risk and and others obviously i think the the facebook business model puts uh well poses risk to to youth and to and to teens uh you care to compare it to cigarette companies which i thought was rightfully so um if this i guess the question is is this level of
02:16:07
risk appropriate or is there a level of risk that would be appropriate i think there's an opportunity to reframe some of these oversight actions so when we think of them as these trade-offs of it's either profitability or safety i think that's a false choice and in reality the thing i'm asking for is a move away from short-termism which is what facebook is run under today it is being led by metrics and not led by people and with appropriate oversight and some of these constraints
02:16:38
it's possible that facebook could actually be a much more profitable company five or ten years down the road because it wasn't as toxic and not as many people quit it but that's one of those counterfactuals that we can't actually test so regulation might actually make facebook more profitable over the long term right that's often the case i think the same could be said for automobiles and go down the list of all those things where there's so much pushback in the beginning i also thought about the question of how do we assess the impact to their
02:17:09
bottom line we had a representative of facebook in here recently who said that eight out of ten facebook users feel their life is better and that their job is to get to ten out of ten maybe this is the twenty percent that they're missing i don't know how large the demographic is of people that are caught up in this circuitous spiral that is really taking them down in the wrong direction how many people that
02:17:40
is do you have any idea that quote last week was really shocking to me because i don't know if you're aware of this but in the case of cigarettes only about 10 percent of people who smoke ever get lung cancer so the idea that 20 percent of your users could be facing serious mental health issues and that's not a problem is shocking i also want to emphasize for people that eating disorders are serious there are going to be women walking around this planet in 60 years with brittle bones because of choices that facebook made around emphasizing profit today or there
02:18:13
are going to be women in 20 years who want to have babies who can't because they're infertile as a result of eating disorders today they're serious and i think there's an opportunity here for having public oversight and public involvement especially in matters that impact children great well thank you for being so direct on this and for stepping forward i yield back the floor sure thanks senator hickenlooper senator cruz thank you mr chairman ms haugen welcome thank you for your testimony
02:18:43
when it concerns facebook there are a number of concerns that this committee and congress has been focused on two of the biggest have been facebook's intentional targeting of kids with content that is harmful to the children and then secondly a discrete issue is the pattern of facebook and social media engaging in political censorship i want to start with the first issue targeting kids as you're aware and as indeed the
02:19:12
documents that you provided indicated facebook according to the public reporting on it facebook's internal reports found that instagram makes quote body image issues worse for one in three teen girls and additionally it showed that quote 13 percent of british users and six percent of american users traced their desire to kill themselves to instagram is that a fair and accurate characterization of what facebook's
02:19:46
research concluded um i only know what i read in the documents that were included in my disclosure that is that is an accurate description of the ones that i have read i because facebook has not come forward with the total corpus of their known research i don't know what their other things say but yes there is documents that say those things so we had testimony last week in the senate with a witness from facebook who claimed that that uh that information was not accurate and needed to be in context now of course she wasn't willing to provide
02:20:16
the context the alleged mysterious context do you know of any context that would make those data anything other than horrifying and deeply disturbing um engagement ranking and these processes of amplification they impact all users of facebook the algorithms are very smart in the sense that they latch on to things that people want to continue to engage with and unfortunately in the case of teen girls and things like self-harm they develop these feedback cycles where children are
02:20:48
using instagram as to self-soothe but then are exposed to more and more content that makes them hate themselves this is a thing where we can't say 80 percent of kids are okay we need to say how do we save all the kids the wall street journal reported that mark zuckerberg was personally aware of this research do you have any information one way or the other as to mr zuckerberg's awareness of the research um we have a uh excuse me um one of the documents included in the disclosures it details something called project daisy which is an initiative to
02:21:19
remove likes off of instagram the internal research showed that removing likes off instagram is not effective as long as you leave comments on those posts and yet the research directly presented to mark zuckerberg said we should still pursue this as a feature to launch even though it's not effective because the government journalists and academics want us to do this it would get us positive points with the public that kind of duplicity is why we need to have more transparency and
02:21:50
why if we want to have a system that is coherent with democracy we must have public oversight from congress do you know if facebook in any of the research it conducted attempted to quantify how many teenage girls may have taken their lives because of facebook's products i am not aware of that research do you know if facebook made any changes when they got back that 13 percent of british users and six percent of american users traced their desire to kill themselves to instagram do you know if they made any changes
02:22:20
in response to that research to try to correct or mitigate that i found it very surprising that when antigone davis was confronted with this research last week she couldn't enumerate a five-point plan or a 10-point plan of the actions that they took i also find it shocking that once facebook had this research it didn't disclose it to the public because this is the kind of thing that should have oversight from congress so when you were at facebook were there discussions about how to respond to this research i did not work directly on issues concerning children these are just documents that were freely
02:22:52
available in the company so i am not aware of that okay do you have thoughts as to what kind of changes facebook could make to reduce or eliminate these harms you mentioned earlier concerns around free speech a lot of the things that i advocate for are around changing the mechanisms of amplification not around picking winners and losers in the marketplace what does that mean oh sure so like i mentioned before you know how on twitter if you have to click through on
02:23:23
a link before you re-share it small actions like that friction don't require picking good ideas and bad ideas they just make the platform less twitchy less reactive and facebook's internal research says that each one of those small actions dramatically reduces misinformation hate speech and violence inciting content on the platform so and we're running out of time but but on the second major topic of concern of facebook which is censorship based on what you've seen are you are you concerned about political
02:23:53
censorship at facebook and in big tech i believe you cannot have a system that uh has as big an impact on society as facebook does today with as little transparency as it does i am a strong proponent of chronological ranking uh ordering by time with a little bit of spam demotion because i think um we don't want computers deciding what we focus on we should have software that is human scaled where humans have conversations together not computers facilitating who we get to hear from so
02:24:25
how could we get more transparency what would produce that um i strongly encourage the development of some kind of regulatory body that could work with academics work with researchers work with other government agencies to synthesize requests for data that are privacy conscious this is an area that i'm really passionate about um and because right now no one can force facebook to disclose data and facebook has been stonewalling us or even worse they gave inaccurate data to researchers as the scandal recently showed what data
02:24:57
should they turn over my time's expired so um uh for example um even data as simple as what integrity systems exist today and how well do they perform like there are lots and lots of people around the world to whom facebook is conveying that facebook's safety systems apply to their language and those people aren't aware that they're using a raw original dangerous version of facebook just basic actions like transparency would make a huge difference thank you
02:25:28
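The witness's answers above describe two content-neutral interventions: ordering a feed chronologically with only spam demotion applied, and adding friction (clicking through a link before resharing) instead of engagement-based ranking. The minimal Python below is an invented toy sketch of those ideas, not Facebook's or Twitter's actual code; the Post fields, the scores, and the 0.8 threshold are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float             # posting time, seconds since epoch (assumed field)
    predicted_engagement: float  # toy model score: likelihood of clicks/reshares (assumed)
    spam_score: float            # 0.0 (clean) .. 1.0 (near-certain spam) (assumed)

def engagement_ranked(posts):
    # Engagement-based ranking: optimizes for predicted interaction,
    # which the testimony argues can favor provocative content.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_with_spam_demotion(posts, spam_threshold=0.8):
    # Chronological ranking "with a little bit of spam demotion":
    # drop likely spam, then order purely by recency, so the system
    # is not picking winning and losing ideas.
    kept = [p for p in posts if p.spam_score < spam_threshold]
    return sorted(kept, key=lambda p: p.timestamp, reverse=True)

def can_reshare(has_clicked_through: bool) -> bool:
    # "Friction": require clicking through a link before resharing,
    # a small content-neutral step that slows reflexive amplification.
    return has_clicked_through

if __name__ == "__main__":
    feed = [
        Post("a", 100.0, 0.9, 0.10),  # older post, highly engaging
        Post("b", 200.0, 0.2, 0.05),  # newest post, low engagement
        Post("c", 150.0, 0.5, 0.95),  # likely spam
    ]
    print([p.author for p in engagement_ranked(feed)])                 # ['a', 'c', 'b']
    print([p.author for p in chronological_with_spam_demotion(feed)])  # ['b', 'a']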
thanks senator cruz senator lummis thank you mr chairman and thank you for your testimony if you were in my seat today instead of your seat what documents or unanswered questions would you seek from facebook especially as it relates to children but even generally speaking i think any research regarding what facebook calls problematic use i.e. the addictiveness of the product is of vital importance and anything around what
02:26:01
facebook knows about parents' lack of knowledge about the platform i only know about the documents that i have seen right i did not work on teens or child safety myself but in the documents that i read facebook articulates the idea that parents today are not aware of how dangerous instagram is and because they themselves did not live through these experiences they can't coach their kids on basic safety things and so at a minimum facebook ought to disclose what it knows in that context
02:26:32
okay so we're trying to protect individuals' data that they're gathering have data privacy but have transparency in the manner in which the data is used can we bridge that gap um i think reasonable people can have a conversation on how many people need to see a piece of content before it's not really private like if a hundred thousand people see
02:27:03
something is it private if 25 000 people see it is it private just disclosing the most popular content on the platform including statistics around what factors went into the promotion of that content would cause radically more transparency than we have today on how facebook chooses what we get to focus on how they shape our reality okay if if our focus is protecting the first amendment and our rights to free speech while very carefully regulating
02:27:36
data privacy i've heard there are a number of things that are being discussed in congress everything from antitrust laws to calling facebook a utility to the idea that you just raised of a regulatory board of some sort that has authority through understanding of the algorithms and how they're used and other
02:28:10
mechanisms that create what we see the the face of facebook so to speak um tell me a little more about how you envisioned that board working what is the in your mind based on your understanding of the company and the ill consequences uh what is the best approach to bridging the gap between keeping speech free and protecting individual privacy with
02:28:44
regard to data so i think those issues are they are independent issues so we can talk about free speech first which is having more transparency like facebook has solutions today that are not content-based and i am a strong advocate for non-content-based solutions because those solutions will also then protect the most vulnerable people in the world in a place like ethiopia where they speak six languages if you have something that focuses on good ideas and bad ideas those systems
02:29:14
don't work in diverse places so investing in non-content-based ways to slow the platform down not only protects our freedom of speech it protects people's lives the second question is around privacy and this question of how can we have oversight and have privacy there is lots and lots of research on how to abstract data sets so you're not showing people's names you might not even be showing the content of their post you might be showing data that is about the content of their post but not the post itself there are many ways to structure these data sets that are privacy
02:29:46
conscious and the fact that facebook has walled off the ability to see even basic things about how the platform performs or in the case of their past academic research releasing inaccurate data or not being clear about how they pulled that data is just part of a pattern of behavior of facebook hiding behind walls and operating in the shadows and they have far too much power in our society to be allowed to continue to operate that way well i had heard you make the analogy earlier to the tobacco industry and i
02:30:16
think that that's an appropriate analogy i i really believe we're searching for the best way to address the problem and i'm i i'm not sure that it is the heavy hands like breaking up companies or um uh calling them a utility uh which is why your approach of integrating people who understand
02:30:48
the math and the uses of the math with protecting privacy is intriguing to me so the more information that you can provide to us about how that might work to actually address the problem i i think would be helpful so um in my case this is an invitation to you uh to provide to my office or the committee information about how we can get at the
02:31:21
root of the problem that you've identified and can document and save people's privacy so uh i extend that invitation to you and i thank you for your testimony uh mr chairman i yield back thanks senator lummis senator sullivan thank you mr chairman and i want to thank our witness here it's been a good hearing a lot of information has been learned particularly on the issue of how this is impacting our kids
02:32:01
i think we're going to look back 20 years from now and all of us are going to be like what in the hell were we thinking when we recognize the damage that it's done to a generation of kids do you agree with that ms haugen uh facebook has made statements in the past about how much benefit instagram is providing to kids' mental health like kids are connecting who were once alone uh what i'm so surprised about that is if
02:32:32
if instagram is such a positive force have we seen a golden age of teenage mental health in the last 10 years no we've seen i've seen the opposite right we've seen escalating rates of suicide and depression amongst teenagers do you think those rates are at least in part driven by the social media phenomena there is a broad swath of research that supports the idea that usage of social media amplifies the risk for these mental health harms right now and this hearing's helping illuminate it we are seeing and facebook's own research
02:33:03
shows that right yeah say that again i said and facebook's own research shows that right the kids are saying kids are saying i am unhappy when i use instagram and i can't stop but if i leave i'm afraid i'll be ostracized right and that's that's so sad so they know that that's what their research shows so what do you think drives them to i had this discussion with the witness last week and i said well you know i think they called it their time out or stop i said but isn't that
02:33:34
incompatible with your business model because your business model is more time online more eyeballs online isn't that the fundamental element of their business model facebook has had both an interesting opportunity and a hard challenge from being a closed system so they have had the opportunity to hide their problems and like often people do when they can hide their problems they get in over their heads and i think facebook needs an opportunity to have congress step in and say guess what you don't have to struggle by yourself
02:34:05
anymore you don't have to hide these things from us you don't have to pretend they're not problems you can declare moral bankruptcy and we can figure out how to fix these things together because we solve problems together we don't solve them alone and by moral bankruptcy one of the things i appreciate about the phrase that the chairman and you've been using is one of those elements which is that they know this is a problem they know it's actually negatively impacting the mental health of the most precious assets we have in america our youth our kids i have three daughters um
02:34:37
they know that that is happening and yet the moral bankruptcy from your perspective is the continued the continuation of this simply because that's how they make money i i phrase it slightly differently we have financial bankruptcy because we value people's lives more than we value money right the people get in over their heads and they need a process where they admit they did something wrong but we have a mechanism where we forgive them and we have a way for them to move forward facebook is stuck in a feedback loop that they cannot get out of they have been hiding
02:35:08
this information because they feel trapped right like they would have come forward if they had solutions to these things they need to admit they did something wrong and they need help to solve these problems and that's what moral bankruptcy is let me ask i'm going to switch gears here and this is uh what's your current position right now in terms of disinformation and counter espionage um my last role at facebook was in counter espionage sorry your last role okay yeah so one of the things this is a very different topic and um i've only got a minute or so left but
02:35:40
right now is facebook i know facebook is not allowed in countries like china but do they provide platforms for authoritarian or terrorist-based leaders like the ayatollahs in iran that's the largest state sponsor of terrorism in the world um or the taliban or xi jinping who is certainly in my view our biggest rival for this century a communist party dictator who's trying to export
02:36:11
his authoritarian model around the world do they provide a platform for those kinds of leaders who in my view clearly don't hold america's interests uh in mind during my time working with the threat intelligence org i was a product manager supporting the counter espionage team um my team directly worked on uh
02:36:42
tracking chinese participation on the platform surveilling say uyghur populations in places around the world you could actually find the chinese based on them doing these kinds of things so facebook yeah i'm sorry um we also saw active participation of say the iran government doing espionage on other state actors um so this is definitely a thing that is happening and i believe facebook's consistent understaffing of the counter-espionage information operations and counter-terrorism teams
02:37:12
is a national security issue and i'm speaking to other parts of congress about that so you are saying in essence that the platform whether facebook knows it or not is being utilized by some of our adversaries in a way that helps push and promote their interests at the expense of america yes facebook is very aware that this is happening on the platform and i believe the fact that congress doesn't get a report of exactly how many people are working on these things internally is unacceptable because you have a right to keep the american people safe
02:37:42
great thank you very much thanks senator sullivan uh you may have just opened an area for another hearing sorry yeah yeah i have strong national security concerns about how facebook operates today well mr chairman maybe we should right i mean it's a i'm not being at all facetious uh thank you for your questions on this topic and i know you have a busy schedule but we may want to discuss this issue with you members of our committee uh at least informally and if you'd be willing to come back for
02:38:13
another hearing that uh certainly is within the realm of possibility i haven't consulted the ranking member but uh or the chairwoman but um thank you for your honesty and your candor on that topic uh senator scott thank you chairman um first off thanks for coming forward and thanks for coming forward in the manner that you did you wanted to have positive change and that's not always what happens earlier this year i sent a letter to facebook and other social media platforms asking them to detail the
02:38:45
harmful impacts the effects on mental health their platforms have on uh children and teens so your reports reveal that facebook has been clearly fully aware of this for a while and the harmful impacts especially on young women so i think we all agree that's completely unacceptable and we've got to figure out how we protect the people that are vulnerable in this country from the harmful impacts of facebook and other social media platforms so first off do you think there should be um uh greater consideration for age when it comes to using any social media
02:39:17
i strongly encourage raising age limits to 16 or 18 years old based on looking at the data around problematic use or addiction on the platform and uh children's self-regulation issues so so i think you addressed this a little bit but why do you think facebook didn't address this publicly when they they figured out internally that they were having an adverse impact on young young people especially young women why didn't they come forward and say i've got we've got a problem we've got to figure this out i i have a huge amount of empathy for
02:39:49
for facebook these are really really hard questions and part of why i'm saying i think they feel a little a little trapped and isolated is the problems that are driving uh negative social comparison on instagram facebook's own research says instagram is actually distinctly worse than say tiktok or snapchat or reddit um because tiktok is about doing fun things with your friends snapchat is about faces and augmented reality uh reddit is vaguely about ideas
02:40:20
um but instagram is about bodies and about comparing lifestyles and so i think there are real questions where like instagram would have to come in and think hard about their product or about like what is their product about and i think i don't think those answers are immediately obvious that's why i believe we need to solve problems together and not alone because collaborating with the public will give us better solutions so do you think facebook was trying to try and mitigate the problem i think within the set of incentives that they were working within they did the best they could unfortunately those
02:40:51
incentives are not sustainable and they are not acceptable uh in our society do you think facebook and other social media platforms ought to be required to report any harmful effects they have on young people uh one of the things that i found uh very interesting after the report in the wall street journal on teen mental health was that a former executive at the company said facebook needs to be able to have private research and the part that offended me was that facebook has had some of this research on the negative effects of instagram on teenagers for years
02:41:23
i strongly support the idea that facebook should have a year maybe 18 months to have private research but given that they are the only people in the world who can do this kind of research that the public never gets to do it they shouldn't be allowed to keep secrets when people's lives are on the line so because because to be clear if they make 40 billion a year they have the resources to solve these problems they're choosing not to solve them yeah didn't that surprise you they wouldn't put more effort into this i know cause you know it's going to catch up with them eventually right yeah like like i mentioned earlier to senator
02:41:54
hickenlooper right coming in and having oversight might actually make facebook a more profitable company five or ten years from now because toxicity facebook's own research shows uh they have something called an integrity holdout these are people who don't get protections from integrity systems to see what happens to them and those people who deal with a more toxic painful version of facebook use facebook less and so one could reason a kinder friendlier more collaborative facebook might actually have more users
02:42:25
five years from now so it's in everyone's interest do you think i've got a bill and there's a lot of bills that i think we've all talked about but mine is called the data act it's going to require express consent from users for large platforms to use algorithms on somebody do you agree with that i mean shouldn't we consent before they get to take our data everything about us and go sell it i think how they send things to us um selling personal data that is an issue i believe people should have substantially more control over um
02:42:56
uh most people are not well informed on what the personal costs of having their data sold are and so i worry about um pushing that choice back on individual consumers in terms of should people consent to working with algorithms i worry that if facebook is allowed to give users the choice of do you want an engagement based news feed or do you want a chronological news feed like ordered by time maybe a little spam demotion that people will choose the more addictive option that engagement based
02:43:28
ranking even if it is leading their their daughters to eating disorders all right thank you thanks senator scott i think we have concluded the first round unless we're missing someone who is uh on line and not hearing anyone let's go to the second round thank you again for your patience i know you have a hard stop i think at 1 30. so we'll be respectful of that
02:44:00
limitation and i'll begin by asking a few questions first let me say senator klobuchar very aptly raised with you the principal obstacle to our achieving legislative reform in the past which is the tons of money spent on lobbyists and other kinds of influence peddling to use a pejorative word that is so
02:44:30
evident here in the united states congress some of it's dark money some of it is very overt but i guess the point i'd like to make to you personally is that your being here really sends a profound message to our nation that one person can really make a difference one person standing up speaking out can overcome a lot of those obstacles for us and you have crystallized
02:45:00
in a way our consciousness here you have been a catalyst i think for change in a way that we haven't seen and i've been working on these issues for 10 15 years and you have raised awareness in a way that i think is very unique so thank you not only for your risk taking and your courage and strength in standing up but also for the effect that it has had
02:45:31
uh and i also want to make another point you can tell me whether i'm correct or not i think there are other whistleblowers out there i think there are other truth-tellers in the tech world who want to come forward i think you're leading by example i think you are showing them that there's a path to make this industry more responsible and more caring about kids and about the nature of our public discourse generally or about the
02:46:02
strength of our democracy and i think you have given them a boost those whistleblowers out there and potentially coming forward i think that's tremendously important i think also again you can tell me if i'm wrong there are a lot of people on facebook who are cheering for you because there are public reports and i know of some of my friends in this world who tell me that
02:46:35
there are people working for facebook who wish they had the opportunity and the courage to come forward as you have done because they feel a lot of reservations about the way that facebook has used the platform used algorithms used content and pushed it on kids in this way so those are sort of hypotheses that i hope you can confirm
02:47:07
uh and i also would like to ask you because a lot of parents are watching right now so you've advised us on what you think we should do the reforms some of them that you think we should adopt stronger oversight authorized by congress better disclosure because right now facebook essentially is a black box yes for most of america facebook is a black box that's designed by mark zuckerberg
02:47:40
incorporated mark zuckerberg and his immediate coterie and the buck stops with him and reform of section 230 so there's some legal responsibility so people have a day in court some kind of recourse legally when they're harmed by facebook because right now it has this broad immunity most of america has no idea essentially you can't sue facebook you have no recourse
02:48:16
most of america doesn't know about section 230 and if you push a lot of members of congress they wouldn't know either it's actually slightly worse than that facebook made a statement in a legal proceeding recently where they said they had the right to mislead the court because they had immunity right that section 230 gives them immunity so why should they have to tell the truth about what they're showing which is kind of shocking well it is shocking to a lawyer yeah
02:48:48
which some of us are uh it's also utter disregard and contempt for the rule of law and for the very legal structure that gives them that kind of protection so it's kind of a new low in corporate conduct at least in court uh so you've you provided us with some of the reforms that you think are
02:49:19
important and i think that the oversight goes a long way because it in turn would make public a lot of what is going on in this black box but for now since a lot of teens and tweens will be going home tonight as you've said to endure the bullying the eating disorders the invitations to feel insecure about themselves
02:49:52
heightened anxiety they have to live with the real world as it exists right now and they will be haunted for their lifetimes by these experiences what would you tell parents right now what would you advise them about what they can do because they need more tools and some of the proposals that have been mentioned here would give parents more tools to protect their children right now a lot of parents tell me they
02:50:22
feel powerless they need more information they're way behind their kids in their adeptness online and they feel that they need to be empowered in some way to protect their kids in the real world right now in real time so i offer you that open-ended opportunity to talk to us a little bit about your thoughts very rarely do you have one of these
02:50:54
generational shifts where uh the generation that leads like parents who guide their children have such um a different set of experiences that they don't have the context to support their children in a safe way um there is an active need for schools or maybe the national institutes of health to establish information so that if parents want to learn how they can support their kids it should be easy for them to know what is constructive and not
02:51:26
constructive because facebook's own research says kids today feel like they are struggling alone with all these issues because their parents can't guide them and one of the things that makes me sad when i look on twitter is when people blame the parents for these problems with facebook they say just take your kid's phone away and the reality is those issues are a lot more complicated than that um and so we need to support parents because right now if facebook won't protect the kids we at least need to help the parents to protect the kids
02:51:56
parents are anguished about this issue they are hardly uncaring they need the tools they need to be empowered and i think that the major encouragement for reforms is going to come from those parents and you have pointed out i think in general but i'd like you to just confirm for me uh this research
02:52:28
and the documents containing that research include not only findings and conclusions but also recommendations for changes what i hear you saying is that again and again and again these recommendations were just rejected or disregarded correct uh there is a pattern of behavior that i saw at facebook of facebook choosing to prioritize its profits over people and any time that facebook faced even tiny hits to growth like point one percent of
02:53:00
sessions one percent of views it chose its profits over safety and you mentioned i think bonuses tied to downstream msi yeah could you explain what you meant um so msi is meaningful social interaction uh facebook's internal governance is very much based around metrics so facebook is incredibly flat to the point where they have the largest open floor plan office in the world it's a quarter of a mile long and one room right they believe in flat
02:53:32
and instead of having internal governance they have metrics that people try to move in a world like that it doesn't matter that we now have multiple years of data saying msi may be encouraging bad content might be making spaces where people are scared where they are shown information that puts them at risk it's so hard to dislodge a ruler a yardstick like that and you end up in a situation where because no one is taking leadership like no one
02:54:03
is intentionally designing these systems it's just many many people running in parallel all moving the metric that these problems get amplified and amplified and amplified and no one steps in to bring the solutions and i just want to finish and then i think we've been joined by senator young and then we'll go to senator blackburn and senator klobuchar you know i spent a number of years as an attorney general helping to lead litigation against big tobacco and
02:54:37
i came to hear from a lot of smokers how grateful they were ironically and unexpectedly that someone was fighting big tobacco because they felt they had been victimized as children they started smoking when they were 7 8 12 years old because big tobacco was hooking them and as we developed the research we found they were very methodically and purposefully addicting them at that early age when
02:55:08
they believed that they would make themselves more popular that they would be cool and hip if they began smoking and then nicotine hooked them now physiologically nicotine has addictive properties what is it about facebook's tactics of hooking young people that makes it similar to what big tobacco
02:55:40
has done facebook's own research about instagram contains quotes from kids saying i feel bad when i use instagram but i also feel like i can't stop right i know that the more time i spend on this the worse i feel but like i just can't like that they want the next click they want the next like they the the dopamine you know the little hits all the time and i i feel a lot of pain for those kids right like they they they say they fear
02:56:12
being ostracized if they step away from the platform so imagine you're in the situation in this relationship where every time you open the app it makes you feel worse but you also fear isolation if you don't um i think there's a huge opportunity here to make social media that makes kids feel good not feel bad and that we have an obligation to our youth to make sure that they're safe online thank you senator young um thank you for your compelling testimony in that testimony you discuss how
02:56:48
facebook generates self-harm and and self-hate especially among vulnerable groups like teenage girls i happen to be a father of four kids three daughters two of whom are teenagers and as you as you just alluded to most adults myself included i've never been a teenager during the age of facebook instagram and these other social media platforms
02:57:18
and therefore i think it can be really hard for many of us to fully appreciate the impact that certain posts may have on a teen's mental health so can you discuss the short and long-term consequences of body image issues on these platforms please the patterns that children establish in their teenage years live with them for the rest of their lives the way they conceptualize who they are how they
02:57:53
conceptualize how they interact with other people are patterns and habits they will take with them as they become adults as they themselves raise children i'm very scared about the upcoming generation because when you and i interact in person and i say something mean to you and i see you wince or i see you cry that makes me less likely to do it the next time right that's a feedback cycle online kids don't get those cues and they learn to be incredibly cruel to each other and they normalize it
02:58:24
and i'm scared of what will their lives look like where they grow up with the idea that it's okay to be treated badly by people who allegedly care about them that's a scary future very scary future and i see some evidence of that as do so many parents on a regular basis are there other specific issues of significant consequence that the general public may not be fully aware of that are impacting um vulnerable groups that
02:58:58
you would just like to elevate during this testimony one of the things that's hard for people who don't look at the data of social networks every day whether of harms or just of usage is that there are these things called power laws it means that a small number of users are extremely intensely engaged on any given topic and most people are just lightly engaged when you look at things like misinformation facebook knows that the people who are exposed to the most misinformation are people who are
02:59:28
recently widowed divorced moved to a new city are isolated in some other way when i worked on civic misinformation we discussed the idea of the misinformation burden like the idea that when people are exposed to ideas that are not true over and over again it erodes their ability to connect with the community at large because they no longer adhere to facts that are consensus reality the fact that facebook knows that its most vulnerable users people who
02:59:59
are recently widowed that they're isolated and that the systems that are meant to keep them safe like demoting misinformation stop working when people look at 2000 posts a day right and i just it breaks my heart the idea that these rabbit holes would suck people down and then make it hard to connect with others so ms haugen i desperately want which is the american impulse to solve this problem and i very much believe that uh congress
03:00:32
not only has a role but has a responsibility to figure this out i don't pretend to have all the answers i would value your opinion though as to whether you believe that breaking up facebook would solve any of the problems that you've discussed today do you think it would so as an algorithmic specialist so this is someone who designs algorithmic experiences i'm actually against the breaking up of facebook because even looking inside of just facebook itself
03:01:03
so not even facebook and instagram you see the problems of engagement based ranking repeat themselves so the problems here are about the design of algorithms of ai and the idea that ai is not intelligent and if you break up instagram and facebook from each other it's likely so i used to work on pinterest and the thing that we faced from a business model perspective was that advertisers didn't want to learn multiple advertising platforms they wanted to learn one they got one platform for instagram and facebook and whatever and learning a second one for
03:01:35
pinterest meant pinterest made radically fewer dollars per user and what i'm scared of is right now facebook is the internet for lots of the world if you go to africa the internet is facebook if you split facebook and instagram apart it's likely that most advertising dollars will go to instagram and facebook will continue to be this frankenstein that is endangering lives around the world only now there won't be money to fund it and so i think oversight and regulatory oversight and um finding collaborative solutions with congress is going to be key because
03:02:07
these systems are going to continue to exist and be dangerous even if broken up thank you thank you senator blackburn uh thank you mr chairman uh i have a tweet that was just put up by facebook spokesperson andy stone it says just pointing out the fact that uh frances haugen did not work on child safety or instagram or research these issues and has no direct knowledge of
03:02:41
the topic from her work at facebook so i will simply say this to mr stone if facebook wants to discuss their targeting of children if they want to discuss their practices of privacy invasion or violations of the children's online privacy protection act i am extending to you an invitation to step forward be sworn in and testify
03:03:13
before this committee we would be pleased to hear from you and welcome your testimony one quick question for you what's the biggest threat to facebook's existence is it greed is it regulators is it becoming uh extinct or obsolete for teenage users what is the biggest threat to their existence i think the fact that facebook is driven
03:03:44
so much by metrics and that these lead to a very heavy emphasis on short-termism that every little individual decision may seem like it helps with growth but it makes it a more and more toxic platform that people don't actually enjoy like when they launched meaningful social interactions back in 2018 facebook's own research said that users said it made it less meaningful right i think this aggregated set of short-term decisions endangers facebook's future
03:04:15
but sometimes we need to pull it away from business as usual help it write new rules if we want it to be successful in the future so they can't see the forest for the trees yes yes thank you and i know senator klobuchar is waiting so i'll yield my time back and i thank you thanks senator blackburn thank you very much and thank you to both of you for leadership and all three of us are on the judiciary committee so we're also working on a host of other issues including
03:04:46
the app store issues which are unrelated to facebook actually uh including issues relating to uh dominant platforms when they promote their own content or engage in exclusionary conduct which i know is not our topic today i see the thumbs up from you ms haugen which i appreciate um and i think this idea of establishing some rules of the road uh for these tech platforms goes beyond the kid protection that we so dearly need to do and i just want to make sure
03:05:17
you agree with me on that totally uh i was shocked when i saw the new york times story a couple weeks ago about facebook using its own platform to promote positive news about itself i was like wow i knew you shaped our reality i wasn't aware it was that much right and that's a lot of the work uh that we're doing over there so i want to get to something senator young was talking about misinformation and senator lujan and i have put together a an exception actually to the 230 immunity when it comes to vaccine misinformation in the middle of a public
03:05:48
health crisis last week youtube announced it was swiftly banning all anti-vaccine misinformation and i've long called on facebook to take similar steps they've taken some steps but do you think they can remove this content and do they put in sufficient resources we know the effect of this we know that over half the people that haven't gotten the vaccines it's because of something that they've seen on social media i know a guy i walked into a cafe and he said his mother-in-law wouldn't get a vaccine
03:06:18
because she thought a microchip would be planted in her arm which is false i'm just saying that for the record here in case it gets put on social media could you uh talk about whether there are enough resources to stop this from happening um i do not believe facebook as it's currently structured has the capability to stop vaccine misinformation because they're overly reliant on artificial intelligence systems that they themselves say will likely never get more than 10 to 20 percent of content there you go
03:06:49
and yet it's a company with a market cap over a trillion dollars one of the world's biggest companies that we've ever known and that's what really bothers me here uh senator lujan and i also have pointed out the issue with content moderators does facebook have enough content moderators for content in spanish and other languages besides english um one of the things that was disclosed we have documentation that shows how much operational investment there was by different languages and it showed
03:07:23
a consistent pattern of under investment in languages that are not english um i am deeply concerned about facebook's ability to operate in a safe way in languages beyond maybe the top 20 in the world okay thank you we go back to eating disorders today you've said that you have documents indicating facebook is doing studies on kids under 13 even though technically no kids under 13 are permitted on the platform uh the potential for eating disorder content to be shown to these children raises serious concern senator
03:07:54
blumenthal's been working on this i've long been focused on this eating disorder issue given the mortality rates are you aware of studies facebook has conducted about whether kids under 13 on the platform are nudged towards content related to eating disorders or unhealthy diet practices cnn also did an investigation on this front i have not seen specific studies regarding eating disorders in kids under the age of 13 but i have seen research that indicates that they are aware
03:08:26
that teenagers coach tweens who are on the platform to not reveal too much and to not post too often and that they have categorized that as a myth that you can't be authentic on the platform and that the marketing team should try to advertise to teenagers to stop coaching tweens that way um i believe we've shared that document with congress exactly well thank you and we'll be looking more speaking of the research issue um facebook has tried to downplay the
03:08:58
internal research that was done saying it was unreliable it seems to me that they're trying to mislead us there the research was extensive surveying hundreds of thousands of people traveling around the world to interview users in your view are the internal researchers at facebook who examine how users are affected by the platform um is their work thorough are they experienced is it fair for facebook to throw them under the bus facebook has one of the top ranked research programs in the tech
03:09:30
industry like they've invested more in it than i believe any other social media platform and some of the biggest heroes inside the company are the researchers because they are boldly asking real questions and being willing to say awkward truths um the fact that facebook is throwing them under the bus i think is unacceptable and i just want the researchers to know that i stand with them and that i see them or maybe we should say as the name of one book the ugly truth yeah what about facebook blocking researchers at nyu from accessing the platform does
03:10:01
that concern you these are outside researchers um i am deeply concerned so for context for those who are not familiar with this research there are researchers at nyu who because facebook does not publish enough data on political advertisements or how they are distributed these are advertisements that influence our democracy and how it operates they created a plug-in that allowed people to opt in to volunteer to help collect this data collectively and facebook lashed out at them and even banned some of their individual accounts
03:10:33
the fact that facebook is so scared of even basic transparency that it it goes out of its way to block researchers who are asking awkward questions shows you the need for congressional oversight and why we need to do federal research and federal regulations on this very good thank you thank you for your work thanks senator klobuchar senator markey thank you thank you mr chairman thank you for your incredible leadership on this issue um as early as 2012
03:11:06
facebook has wanted to allow children under the age of 12 to use its platform at that time in 2012 i wrote a letter to facebook asking questions about what data it planned to collect and whether the company intended to serve targeted ads at children now here we are nine years later debating the very same issues today ms hogan you've made it abundantly clear why facebook wants to bring more children onto the platform
03:11:37
it's to hook them early just like cigarettes so that they become lifelong users so facebook's profits increase yet we should also ask why in the last nine years has the company not launched facebook for kids or instagram for kids after all from the testimony here today facebook appears to act without regard to any moral code or any conscience and instead puts profit above people profit above all else the reason why facebook hasn't officially permitted kids 12 and under
03:12:10
to use its platform is because the children's online privacy protection act of 1998 that i'm the author of exists because there is a privacy law on the books which i authored that gives the federal trade commission regulatory power to stop websites and social media companies from invading the privacy of our children 12 and under that's why we need to expand the children's online privacy protection act that's why we need to pass
03:12:42
the kids act that senator blumenthal and i have introduced and why we need an algorithmic justice act to pass because the absence of regulation leads to harming teens stoking division damaging our democracy that's what you've told us today so ms haugen i want you to come back to the protections that you are calling on us to enact this isn't complicated we're going to be told online all day by these paid
03:13:14
facebook people oh congress can't act they're not experts it's too complicated for uh congress just get out of the way you're not experts well this isn't complicated facebook and its big tech lobbyists are blocking my bills to protect kids because it would cost them money that's how complicated it is so let's start with the kids act from senator blumenthal and i that would ban influencer marketing to kids today's popular influencers peddle
03:13:44
products while they flaunt their lavish lifestyles to young users can you explain how allowing influencer marketing to teens and children makes facebook more money the business model that provides a great deal of the content on instagram is one where people produce content for free they put it on instagram for free no one's charged for it but many of those content creators have sponsorships from brands or from other
03:14:20
affiliate programs um facebook needs those content creators to continue to make content so that we will view content and in the process view more ads facebook provides tools to support influencers who do influencer marketing because it gives them the supply of content that allows them to keep people on the platform viewing more ads making more money for them yeah so i am actually the author of the 1990 children's television act what does that
03:14:51
do well it says to all the television networks in america stop preying upon children stop using all of your power in order to try to get young children in our country hooked on the products that are going to be sold we had to pass a law that banned television stations from doing this that's why i knew that after my law passed in 1996 to break up the monopolies of the telecommunications industry and allow in the googles and the
03:15:22
facebooks and all the other companies uh you name it that we would need child privacy protection there because everyone would just move over to that new venue it was pretty obvious and of course the industry said no way are we going to have privacy laws for adults and they blocked me from putting that on the books in 1996 but at least for children i got up to age 12 that's all i could get out of the industry but uh we also know that uh as time has moved on they have become
03:15:54