For decades, U.S. education has been dominated by the American left. Its stranglehold was highly visible during the Biden administration, as countless stories surfaced about wildly inappropriate books in school libraries, critical race theory being taught in classrooms, and national associations calling for parents to be designated domestic terrorists. How did our public school systems —