Data Thinning to Avoid Double Dipping in Statistical Analysis - Lecture 1

Broad Institute via YouTube

Overview

Explore the concept of "double dipping" in statistical analysis and learn about approaches to address this issue in a lecture from the Models, Inference and Algorithms series at the Broad Institute. Dive into Lucy Gao's presentation on data thinning, focusing on Poisson thinning and its generalizations, which offer a way to avoid double dipping in unsupervised settings, particularly in single-cell RNA sequencing analysis. Gain insights into the limitations of sample splitting and discover how data thinning can be applied across a range of distributions and problem domains. Additionally, benefit from Yiqun Chen's primer on testing data-driven hypotheses after clustering, addressing statistical validity concerns that arise in biomedical research when hypotheses are generated and tested on the same dataset. Learn about a conditional selective inference approach for testing differences in means between clusters obtained through hierarchical and k-means clustering, with applications to single-cell RNA sequencing analyses.
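
For a concrete picture of the core idea, here is a minimal sketch of Poisson thinning in Python (not code from the lecture; the `poisson_thin` helper name and the 50/50 split are illustrative choices). It relies on the standard fact that binomially subsampling a Poisson count yields two independent Poisson counts.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_thin(X, eps=0.5, rng=rng):
    """Split a Poisson count matrix into two independent pieces.

    If X[i, j] ~ Poisson(lambda_ij), drawing
    X_train[i, j] ~ Binomial(X[i, j], eps) gives independent
    X_train ~ Poisson(eps * lambda) and
    X_test = X - X_train ~ Poisson((1 - eps) * lambda).
    """
    X_train = rng.binomial(X, eps)
    X_test = X - X_train
    return X_train, X_test

# Toy "cells x genes" count matrix standing in for scRNA-seq data
X = rng.poisson(lam=5.0, size=(100, 20))
X_train, X_test = poisson_thin(X, eps=0.5)

# One could, e.g., cluster on X_train and test for differences between
# clusters on X_test, rather than using the same counts for both steps.
```

Because the two pieces are independent, hypotheses generated from the first piece can be tested on the second with standard tools, which is what makes this an alternative to sample splitting when splitting observations is not an option.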

Syllabus

MIA: Lucy Gao, Data thinning to avoid double dipping; Primer by Yiqun Chen

Taught by

Broad Institute
