Harvard Fake Data Scandal - Huge New Development
https://youtu.be/KcFbb5lVJA4
Partial Transcript
There has been a major new development in the Harvard fake data scandal involving Professor Francesca Gino.
The development I’m talking about is that the full report from Harvard’s internal investigation into her has been released and made public. This report is important because it sits at the heart of Gino’s $25 million defamation lawsuit against Harvard University and Data Colada. The media, myself included, has gone through the report, and there’s one detail that everyone is just absolutely fascinated by, and I want to share that with you today.
In order to do so, I think it’s necessary to quickly recap this entire case at a very high level, just so we’re all on the same page. So here’s the recap.
Francesca Gino is a top professor of behavioral science at Harvard University. She is one of the most well-respected, well-known and highest-earning professors in all of my field of behavioral science. But after much success, things started to go wrong for her when anomalies were found in her data by three researchers who run a blog called Data Colada. When they found anomalies in her data, and what looked like evidence of data fraud, rather than publish it on their blog they first went to Harvard University to tell them about their findings. Harvard looked at their findings and said: okay, don’t publish anything on your blog yet. Let us first do an internal investigation into Francesca Gino. Then, if the internal investigation confirms your findings, you can publish it on your blog. Data Colada agreed, and that’s exactly what happened.
Harvard then launched a full internal investigation into Francesca Gino, a process which took a total of 11 months. After the investigation finished, the dean of Harvard Business School said he had to accept its conclusions and put Francesca Gino on unpaid administrative leave. I should add that Data Colada then published their four blog posts, each one corresponding to a different paper published by Francesca Gino. These four blog posts went crazy viral. They were all over social media, and it seemed like pretty much every major publication was running the story. In the process, Francesca Gino’s name and reputation essentially got trashed. So, as you can imagine,
Gino isn’t happy about this. She claims, and has always claimed, that she’s innocent. So she decided to sue Harvard Business School, her former employer, and the guys who run the Data Colada blog for a headline-grabbing $25 million. She claims their accusations are biased, sexist and false, and therefore she’s suing them for defamation.
So that’s essentially where we’ve been until this week. But because the Harvard internal investigation is at the center of Francesca Gino’s lawsuit, Harvard filed to have the investigation made more widely known. After they did, the media actually stepped in also. Specifically, it was the Reporters Committee for Freedom of the Press and The New Yorker. These two organizations intervened not only to request that the internal report be made more widely known, but that it actually be put into the public record. Just this week, at the time of filming, a federal judge made it so.
What did we learn from the release of Harvard’s internal report? Well, firstly, we learned that it’s really long: it was a 1,300-page report. But there is just one part of this report that everyone is absolutely fascinated by, including myself. The biggest new revelation from this report is that Francesca Gino actually accused one of her co-authors of being the reason for the anomalies in her data. To be more specific, Francesca Gino accused one of her female co-authors on the 2012 paper of sabotaging her and being the reason why there are anomalies in that 2012 paper.
So in the rest of this video, I want to do a deep dive into that 2012 paper. Let’s remind ourselves of what the study was about and what the accusations were, and then finally talk about this new accusation that one of her co-authors is responsible for tampering with the data. I’ll give you my two cents, based on what I know about these co-authors, about how credible I think that is. Let’s talk about this 2012 paper by Francesca Gino.
This 2012 paper looked at whether an honesty pledge could make somebody more honest when filling out a form. Specifically, they were testing whether putting an honesty pledge at the top of a form would be more effective than putting it at the bottom. And while you might think that is a stupid thing to research and a waste of money, it actually makes a lot of sense in psychology. You see, in psychology there is this idea of self-consistency bias. This is the theory that, as humans, we strive to be consistent with our previous actions. So the idea behind this intervention is that if we put an honesty pledge at the top of a form, then when you go to fill out the rest of the form, you will instinctively try to be consistent with your action of signing that honesty pledge, which will make you more honest. Whereas if the honesty pledge is at the end of the form, you don’t have that self-consistency bias working for you, so people will be less honest. So from a psychological point of view it does actually make sense to do this, and certainly a lot of people were interested in this idea because it’s such an easy thing to do.

The 2012 paper tested this same idea in many different scenarios, and the scenario Francesca Gino was in charge of was one where participants were brought in to do some math puzzles. The way this study worked was that participants would be brought into a room to do a certain number of fairly simple math puzzles in a set amount of time, and they were told that for each math puzzle they solved, they would be rewarded with $1.
But after they had finished their time doing the math puzzles, rather than hand in their worksheet to be marked and then be paid accordingly, they would go to the back of the room to shred their worksheet and then fill out a separate form where they would write down how many math puzzles they had solved and, therefore, how much they deserved to be paid. It was this second form which carried the manipulation: some participants got the honesty pledge at the top of the form, and some got it at the bottom. The authors were looking to see if the people who got the honesty pledge at the top of their form actually reported more honestly. Now, you’re probably wondering how they could tell how honest the participants were if the original worksheets were shredded. Well, that’s because, according to the authors, the shredder was rigged. Here’s Dan Ariely, one of the co-authors, explaining that the shredder only shredded the sides of the page but left the body of the page intact. That is how they were supposedly able to tell how honest people were. What were the results? Well, according to what was originally published, simply moving the honesty pledge from the bottom to the top reduced cheating from 79% to 37%, an absolutely huge effect size. Let me tell you, people loved this idea. It’s one of those ideas in behavioral science that is just so simple to do, yet apparently has such large effect sizes, that it really captures people’s imagination. So many companies and organizations have forms they need people to fill in, and the idea that you could simply put an honesty pledge at the top of a form and get these amazing improvements in the quality of the responses seemed like a no-brainer to loads of people. So this idea got really popular.
However, the three guys at Data Colada grew suspicious of this paper, so they decided to look into the data. After obtaining the original data set from this study, they found some weird anomalies. If you look at this screenshot of the data, and specifically at the P# column, which holds the participant ID, the unique ID given to every participant who enters the study: because it’s a unique ID, there should be no duplicates. Additionally, if you look at how the data has been sorted, it appears to have been sorted so that all of condition one is together, then all of condition two, and within each condition the IDs are in ascending order, i.e. the numbers get bigger as you go down. That appears to be how the data was sorted. But if you look at the P# column, there are some weird anomalies, highlighted in yellow. Here you can see 49 appears twice, a duplicate ID which should never happen, and then there are these other numbers highlighted in yellow that appear to be out of sequence, given that the rows are supposedly sorted in ascending order. So Data Colada flagged these rows as suspicious, because they look like somebody has tampered with the data, adding in rows or moving them to places where they shouldn’t be. The Data Colada guys wanted to know if these suspicious-looking rows produced a strong effect in favor of the authors’ hypothesis, so they plotted the data and marked the suspicious rows with a circle and an X. As you can see from the graph they published on their blog, the suspicious rows favor the hypothesis very strongly, indicating that these rows show a bigger effect than what was really observed in the real data.
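To make the logic of these checks concrete, here is a minimal sketch of the kind of screening described above: flagging duplicate participant IDs and IDs that break the ascending sort order within each condition. This is my own illustrative reconstruction, not Data Colada’s actual code, and the column layout (condition, participant ID) is an assumption.

```python
def flag_suspicious_rows(rows):
    """rows: list of (condition, participant_id) tuples in spreadsheet order.
    Returns the set of row indices that look anomalous. Illustrative only;
    not the actual Data Colada analysis."""
    suspicious = set()
    seen_ids = set()
    prev_id_by_condition = {}
    for i, (cond, pid) in enumerate(rows):
        # A unique participant ID should never repeat anywhere in the sheet.
        if pid in seen_ids:
            suspicious.add(i)
        seen_ids.add(pid)
        # Within a condition, IDs should ascend if the sheet was sorted.
        prev = prev_id_by_condition.get(cond)
        if prev is not None and pid < prev:
            suspicious.add(i)
        prev_id_by_condition[cond] = pid
    return suspicious

# Toy data: row 3 duplicates ID 49, row 7 breaks the ascending order.
data = [(1, 7), (1, 12), (1, 49), (1, 49), (1, 51),
        (2, 5), (2, 9), (2, 8), (2, 30)]
print(sorted(flag_suspicious_rows(data)))  # → [3, 7]
```

Rows flagged this way are only candidates for scrutiny; as the Data Colada posts did, you would then check whether those rows disproportionately favor the hypothesis.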
Now, this story was very convincing, but a few weeks after Gino filed her lawsuit, she actually started her own website where she talks about this case, and she wrote a post defending herself against the anomalies found in this 2012 paper. She said these are likely explained by the research assistant who was helping her with this study mishandling the data when entering it into the spreadsheet. After all, the study took place in person, and the data then had to be manually entered into a spreadsheet or database. So that’s what we thought.
Gino’s defense to this accusation, “it wasn’t me, it was my research assistant,” is a pretty classic excuse which every major academic gives whenever they’re accused of data fraud. But now that the full Harvard report has been released, we know this was actually only one of two explanations Gino gave to the investigators. She said it could have been her research assistant, or it could have been her female co-author on the paper, both of whom had access to her data and a motive. Man, this is like proper true crime for nerds.
Source: http://mechonomic.blogspot.com/2024/03/harvard-fake-data-scandal-huge-new.html