COMS W3261: Computer Science Theory, Fall 2022

Announcements

General information

This is a three-credit required course for the undergraduate CS program. The course requires Discrete Math (COMS W3203) as a prerequisite (see prerequisite resources below).

Lectures for Section 001 will take place MW 8:40am-9:55am in 501 NWC

Lectures for Section 002 will take place MW 10:10am-11:25am in 501 NWC

The two sections share the same Gradescope and Edstem pages (both linked through Courseworks).

Teaching staff and Office Hours

Teaching staff: If you have any administrative questions, you should email the Professor with both head TAs CC'ed or post a private question on Edstem. Questions with sensitive personal information can be emailed to just the Professor.

Office hours: The full schedule can be found in the course calendar.

Note the different locations for different office hours! (CS TA room in 1st floor Mudd, Milstein 502 in Barnard, CSB 488 through 4th floor Mudd, and 514 CSB for Prof. Malkin).

Class description and syllabus

This course is an introduction to models of computation, computability, and complexity. We will ask questions like: How do we define the abstract notion of "computation" or a "computer"? What problems can be computed? What problems can be computed efficiently? Can we prove that some problems can never be computed, no matter how fast our computers become? The course is highly mathematical and abstract; in particular, there will be minimal programming assignments (if any) and homework problems will frequently involve proving or disproving various assertions. Students are expected to be comfortable with the material covered in Discrete Math (W3203) or the equivalent.

Topics (not exhaustive): automata and the languages they define, context-free grammars and the languages they define, Turing machines, basic complexity theory (such as P vs. NP).

Lecture Summaries

Below are brief summaries, written shortly after each lecture, of what was covered, along with readings. Readings refer to the textbook (Sipser), though sometimes may include other pointers or handouts. You are responsible for what was covered in class, which typically follows the textbook. You should make sure to go over each lecture's material before the following lecture (the corresponding textbook chapters are posted). You may, but don't have to (unless otherwise specified), read ahead for the next lecture. See Courseworks for the notes written during the lecture (possibly lightly edited right after). Some lectures will be recorded when there's a specific reason (e.g., during Jewish high holidays), and recordings posted to Courseworks.
  • Lecture 1 (9/7): Introduction to class, motivation and overview. Definitions of alphabet, strings, and languages.
    Readings: Chapter 0 (we didn't go over most of it in class, but this is background material you're supposed to know from discrete math).
  • Lecture 2 (9/12): Defined a DFA and the language recognized by it. A language is regular iff there exists a DFA recognizing it. We saw some examples of regular languages, and discussed intuition for when a language is regular (when you can use a fixed finite memory to determine whether or not an arbitrarily long string is in the language).
    Readings: 1.1 (most of it). Optional reading ahead: finish 1.1, read 1.2.
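The DFA definition from Lecture 2 can be made concrete in a few lines of code. Below is a minimal simulator sketch (illustrative only, not class material), run on a DFA for binary strings containing an even number of 1s:

```python
# Minimal DFA simulator (illustrative sketch, not part of the class material).
# A DFA is given here by its transition function, start state, and accept states.

def dfa_accepts(delta, start, accept, w):
    """Run the DFA on string w; return True iff it ends in an accept state."""
    state = start
    for symbol in w:
        state = delta[(state, symbol)]
    return state in accept

# Example: DFA over {0,1} recognizing strings with an even number of 1s.
# The fixed finite memory is just one bit: the parity of 1s seen so far.
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}

print(dfa_accepts(delta, "even", {"even"}, "1011"))  # three 1s -> False
print(dfa_accepts(delta, "even", {"even"}, ""))      # zero 1s  -> True
```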
  • Lecture 3 (9/14): Review and examples of DFAs and regular languages. All finite languages are regular (but the converse is not true). Introduced the non-deterministic finite automaton (NFA), acceptance of a string, and the language recognized by an NFA. Intuition for how to interpret NFA computation on a string (via tokens or a computation tree, corresponding to trying all computations in parallel, or "I'm feeling lucky" guessing). In one of the sections we didn't get to the formal definition; we will do it next time.
    Reading: 1.2 (first part). Optional reading ahead: last part of 1.1 and last part of 1.2
  • Lecture 4 (9/19): Review of NFAs. Examples. Proved that a language is regular if and only if it is recognized by an NFA (one direction of the proof is easy, the other direction involved the subset construction). Example of the subset construction.
    Reading: 1.2 (second part). Optional reading ahead: finish 1.1 and 1.2.
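The subset construction from Lecture 4 can also be sketched in code: each DFA state is the set of NFA states reachable so far. The following is an illustrative sketch (not class material), restricted to NFAs without epsilon-moves for brevity:

```python
# Subset construction sketch: convert an NFA (without epsilon-moves, for
# brevity) into an equivalent DFA whose states are sets of NFA states.
from itertools import chain

def nfa_to_dfa(delta, start, accept, alphabet):
    """delta: dict (state, symbol) -> set of states (missing keys = no move).
    Returns the DFA as (dfa_delta, dfa_start, dfa_accept), with frozensets
    of NFA states as DFA states. Only reachable DFA states are built."""
    dfa_start = frozenset({start})
    dfa_delta, dfa_accept = {}, set()
    todo, seen = [dfa_start], {dfa_start}
    while todo:
        S = todo.pop()
        if S & accept:                      # accept if any NFA state accepts
            dfa_accept.add(S)
        for a in alphabet:
            T = frozenset(chain.from_iterable(delta.get((q, a), ()) for q in S))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                todo.append(T)
    return dfa_delta, dfa_start, dfa_accept

# NFA over {0,1} for strings ending in "01": q0 loops, guessing where the
# suffix starts; q0 -0-> q1 -1-> q2 (accept).
delta = {("q0", "0"): {"q0", "q1"}, ("q0", "1"): {"q0"}, ("q1", "1"): {"q2"}}
d, s, acc = nfa_to_dfa(delta, "q0", {"q2"}, "01")

def run(d, s, acc, w):
    for a in w:
        s = d[(s, a)]
    return s in acc

print(run(d, s, acc, "1101"))  # True
print(run(d, s, acc, "10"))    # False
```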
  • Lecture 5 (9/21): Defined operations on languages (i.e., sets): complement, union, intersection, concatenation, power, Kleene star. We claimed that the class of regular languages is closed under all these operations, and started to prove it (we showed it for complement, using DFAs, and for union, using an NFA).
    Reading: last part of 1.1 and last part of 1.2
  • Lecture 6 (9/26): Completed the proofs that the class of regular languages is closed under union (two different proofs, constructing an NFA and constructing a DFA), intersection (two different proofs, constructing a DFA and using previously proven closure properties with De Morgan's law), concatenation, and Kleene star (in both cases, by constructing an NFA). Defined regular expressions and the languages described by them.
    Reading: last part of 1.1, last part of 1.2, beginning of 1.3.
  • Lecture 7 (9/28): Regular expressions and the languages they generate. Proved that a language is generated by a regular expression if and only if it is regular: one direction, that every regular expression has an equivalent NFA, follows from closure properties we already proved. For the other direction, we showed that any NFA can be transformed into an equivalent regular expression by going through a GNFA.
    Reading: 1.3. Optional reading ahead: 1.4
  • Lecture 8 (10/3): Proved that every regular language satisfies the pumping lemma, which intuitively says that every string in the language that is long enough (longer than some pumping constant for the language) must have a non-empty portion that can be pumped up or down any number of times with the result still in the language. Thus, to prove that a language is not regular, it suffices to show that it does not satisfy the pumping lemma. We saw some examples. Also noted that this method works often but not always, as some non-regular languages also satisfy the pumping lemma. We will see more methods to prove that a language is not regular next time.
    Reading: 1.4. Optional reading ahead: 2.1.
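For reference, the pumping lemma from Lecture 8 can be stated precisely as follows (following Sipser's formulation in 1.4):

```latex
% Pumping lemma for regular languages.
If $A$ is a regular language, then there is a number $p$ (the pumping
length) such that every $s \in A$ with $|s| \ge p$ can be divided as
$s = xyz$, where:
\begin{enumerate}
  \item $xy^{i}z \in A$ for every $i \ge 0$,
  \item $|y| > 0$, and
  \item $|xy| \le p$.
\end{enumerate}
```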
  • Lecture 9 (10/5): Reviewed the pumping lemma and how to use it to prove a language is not regular, with another example. Noted that some non-regular languages (as well as all regular languages) satisfy the pumping lemma. Discussed a second way to prove a language is not regular, using closure properties (assume L is regular, combine L with other regular languages using operations for which we have proved closure, to get a language that you already know, or can more easily prove, is not regular, deriving a contradiction). Mentioned a third way, via the Myhill-Nerode theorem, which is in the handout but not part of the required class material.
    Wrapped up the discussion of regular languages, some applications, and an overview of where we are in the class and what is coming.
    Defined context free grammars and the languages they generate, with an example (demonstrating a non-regular language which is context free).
    Reading: 1.4, 2.1, optional handout 4.
  • Lecture 10 (10/10): Context free languages, with several examples. Defined derivation trees (also called "parse trees") and leftmost derivations (each derivation corresponds to a tree; a tree can have many corresponding derivations, but only one leftmost derivation). Defined ambiguous grammars and showed an example ambiguous grammar for which an equivalent non-ambiguous grammar also exists. Defined an inherently ambiguous context-free language and gave an example (without proof). Several other examples of context free languages. Proved that the language of valid regular expressions is context free but not regular.
    Reading: 2.1 (although we skipped the last part on Chomsky Normal Form).
  • Lecture 11 (10/12): Proved that the class of context free languages is closed under the regular operations (noted that it is not closed under intersection and complement, but we will see this later). Used this to prove that if a language is regular, it is also context free (by showing that every regular expression has an equivalent CFG). Defined a right-linear CFG and stated that a language is regular if and only if it has a right-linear CFG (proof omitted, but it is based on showing an equivalence between DFAs and right-linear grammars). Stated the tandem pumping lemma for context free languages, and how to use it to prove that a language is not context free. Mentioned examples of non-context free languages (we will prove next time).
    Reading: 2.3 for the tandem pumping lemma. The rest is not in the textbook: the closure proofs are given as an exercise, and the proof that all regular languages are context free goes through PDAs that we will see next time.
    Optional reading ahead: 2.3. We won't go through 2.4 but you may want to read it for your own interest.
  • Lecture 12 (10/17): Discussed how to use the tandem pumping lemma to prove that a language is not context free, and showed a couple of examples. Proved that the class of context free languages is closed under neither intersection nor complement. Defined pushdown automata (PDA) and their computation, and gave an example. We stated (without proof) that a language is context free if and only if there is a PDA recognizing it.
    Reading: 2.3, 2.2.
  • Lecture 13 (10/20): Reviewed PDAs, and wrapped up the context free language portion of the class. Some discussion putting what we learned in context, mentioned some other models and their power, and overview of what is coming in the class. Defined Turing machines (TM) and contrasted them with DFAs.
    Reading: 3.1.
  • 10/24: Midterm 1 in class
  • Lecture 14 (10/26): Reviewed the TM definition, defined the computation of a TM on a given input (it can accept, reject, or run forever), configurations of a TM, and the language recognized by a TM. Defined a decider (a TM that halts on every input), decidable languages, and recognizable languages. Mentioned the hierarchy of language classes we've seen so far (regular, context free, decidable, recognizable, all languages -- each of these is a proper subset of the next, as we will see). Showed some examples.
    Reading: 3.1.
  • Lecture 15 (10/31): Defined input-output TM and a computable function. Examples of computable functions that can be used as "subroutines" in the construction of other TMs. Showed three variants of TM models and their equivalence to the standard model: a TM with a "stay put" option, a TM with a doubly infinite tape, and a TM with multiple tapes.
    Reading: 3.1 and some of 3.2. Definition 5.17.
  • Lecture 16 (11/02): Went over a common error on the quiz. Proved that the class of TM-decidable languages is closed under complement. As we will prove later, this theorem is not true for TM-recognizable languages. Reviewed the equivalence of the standard TM model to the multi-tape TM. We mentioned that our transformation from multi-tape to one tape incurs a quadratic overhead in running time, and that this overhead is known to be inherent. Defined the non-deterministic TM (NTM) and proved its equivalence to the standard (deterministic) TM. Our transformation of an NTM to an equivalent TM involved performing a breadth-first search of the computation tree, and incurs an exponential overhead in running time. We mentioned that whether or not this overhead is inherent is a big open problem (basically the P vs NP problem, which we will cover later in the class).
    Reading: some of 3.2.
  • Lecture 17 (11/09): Listed many models that are known to be equivalent to TMs, stated the Church-Turing thesis, and discussed our shift to identifying algorithms with TMs and using high-level descriptions of algorithms. Reviewed what needs to be proved to show a language is decidable and that a language is recognizable. Proved that a language L is decidable if and only if L and the complement of L are both recognizable. Along the way, introduced the idea of running two machines in parallel, or for an ever-increasing number of steps. Discussed the important idea that every algorithm/TM can be encoded as a string (or an integer), and thus fed as input to another algorithm (which may manipulate it, run it, try to reason about it, etc). Similarly, other objects (DFAs, graphs, etc) can also be encoded as strings. Showed that the languages ADFA, ANFA, EDFA are decidable.
    Reading: (parts of) 3.2, 3.3, 4.1, Theorem 4.22. If you want to see an explicit, specific example of how you might encode a TM as a string, you can check section 2 in this document (written by TA Freddy Kellison-Linn who covered a lecture in 2018). We will not care about the specific encoding, just that there's one that can be fixed and algorithmically checked.
  • Lecture 18 (11/14): Review of a common mistake on quiz 6. Mentioned (without proof but with a high level overview) that the following languages are decidable: EQDFA, ACFG, ECFG, as well as any language L that is context free. Mentioned without proof that EQCFG and AMBCFG are not decidable. Proved that ATM is recognizable. It is recognized by the universal TM (which encompasses "general purpose computing"), which takes as input a description of an arbitrary TM M and a string w, and simulates M on w. Discussed why the universal TM is not a decider, and mentioned that ATM is in fact undecidable (as we will prove next time). Showed that NETM is recognizable. The recognizing algorithm uses a "dovetailing" technique to run a TM going through all possible strings in a way that makes sure each execution of the loop takes finite time, but if M accepts any string, the algorithm will eventually accept.
    Reading: 4.1.
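The dovetailing idea from Lecture 18 can be illustrated with a toy sketch: interleave runs so that every (input, step bound) pair is eventually tried, and no single non-halting run blocks the others. Everything below is a hypothetical stand-in (the `step` function models "simulate M on input i for k steps"; it is not an actual TM simulator):

```python
# Dovetailing sketch (illustrative, not class material). We enumerate pairs
# (input index i, step bound n) in rounds: in round n = 1, 2, 3, ... we
# simulate the machine on each of the first n inputs for n steps. Each round
# is finite, yet every pair (i, k) is eventually covered.

def dovetail_accepts(step, max_rounds=50):
    """step(i, n) -> True iff input number i is accepted within n steps.
    Returns the first accepted input index found, or None after max_rounds
    (a cutoff added only so this demo terminates)."""
    for n in range(1, max_rounds + 1):
        for i in range(n):
            if step(i, n):
                return i
    return None

# Toy stand-in: input 3 is accepted after 7 steps; all others "run forever".
toy = lambda i, k: i == 3 and k >= 7
print(dovetail_accepts(toy))  # 3
```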
  • Lecture 19 (11/16): Brief overview of what is coming up (proving undecidability / unrecognizability of languages). Defined when sets have the same cardinality (if there's a bijection between them), and defined when a set is countable (if it's finite or there's a bijection from the natural numbers to the set). If A is a subset of B, then the cardinality of A is at most the cardinality of B; if A is a subset of B and B is countable, then A is countable. Examples of countable sets: even positive integers, rationals, the set of all strings over a given alphabet, any language (set of strings) over any alphabet, the set of all TMs, the set of all recognizable languages. Showed Cantor's diagonalization technique, to prove that the set (0,1) of real numbers is not countable. Proved that the set of all languages over any alphabet is uncountable (again via diagonalization). Since there are uncountably many languages, and countably many recognizable languages, we can conclude that there exists a non-recognizable language (in fact, uncountably many non-recognizable languages).
    Reading: 4.2
  • Lecture 20 (11/22): Proved that ATM is not decidable (via diagonalization). Concluded that the complement of ATM is not recognizable. Further concluded that the class of recognizable languages is not closed under complement. Proved that HALTTM ("the halting problem") is not decidable (via a Turing reduction from ATM). Defined Turing reductions. If A is Turing-reducible to B and B is decidable, then A is decidable. Equivalently, if A is Turing-reducible to B and A is undecidable, then B is undecidable. Preview of Rice's theorem that we will see again next time.
    Reading: 4.2, 5.1 (beginning), 6.3 (note: Sipser's "informal notion of reducibility" in chapter 5 is the same as the formal notion of Turing reducibility which he defines in section 6.3).
    Some other resources you may want to check out (not required): (a) an (8-minute) video on (un)countability and Cantor's proof, (b) another (7-minute) video using diagonalization to prove the halting problem is not decidable, (c) this poem-proof for undecidability of the halting problem (again via diagonalization). (Note: we used diagonalization to prove ATM is not decidable, and then showed that if the halting problem HALTTM was decidable, ATM would also be. The above resources directly prove the halting problem is not decidable via diagonalization - this proof is very similar to our proof for ATM.)
    Homework for next time: Watch the following video of my lecture on the topic, from 14:40 to the end (you can watch the beginning too, but that's material we have already covered). This video gives examples of the use of Turing-reductions to prove undecidability, including undecidability of ETM, EQTM and REGTM. You may skip the first proof for REGTM (31:45-37:27) and go directly to the alternative proof.
  • Lecture 21 (11/28): Review of where we are so far and how to prove a language is not decidable. Stated and proved Rice's theorem, and gave an example. How to prove a language is not recognizable (plus bonus material in the notes: refined Rice's theorem).
    Reading: 6.3, problem 5.28 and its solution.
  • Lecture 22 (11/30): Defined a mapping reduction, and proved that if A is mapping reducible to B, then A is also Turing reducible to B (a mapping reduction can be thought of as a special case of a Turing reduction where the reduction calls the decider for B once and outputs the same answer). Proved that A is mapping reducible to B if and only if the complement of A is mapping reducible to the complement of B. For Turing reductions, if A is Turing reducible to B, then each of A and its complement is Turing reducible to each of B and its complement. Proved that if A is mapping reducible to B and B is recognizable, then A is recognizable; equivalently, if A is mapping reducible to B and A is not recognizable, then B is not recognizable. Examples of using mapping reductions to prove unrecognizability: both ALLTM and its complement are not recognizable. Wrapped up the computability portion of the class, and mentioned some historical context and other techniques and problems we have not covered.
    Reading: 5.1 (we didn't cover computation histories), 5.3
  • Lecture 23 (12/05): Complexity portion of the class (very abbreviated). Defined the running time of a TM as a function of the input length. Defined TIME(t(n)), the class of languages that can be decided in O(t(n)) steps (also gave a reminder of big-Oh notation). Defined the class P of polynomial time computable languages. Discussed robustness of the class P to the details of the computational model, and the motivation behind identifying P with efficient computation. Discussed the strong Church-Turing thesis (all reasonable models of computation are equivalent up to a polynomial overhead in running time), and why it may be questionable. Defined non-deterministic time and NP, the class of languages that can be decided by a non-deterministic TM running in poly time. Some examples of languages in P and in NP.
    Reading: 7.1-7.3 (but we skipped a lot)
  • Lecture 24 (12/07): Defined a polytime verifier for a language, taking an input and a short certificate (aka witness or proof) that the input is in the language. NP can be equivalently defined as the class of languages that can be verified in polynomial time. Example. If a language is in P it is also in NP. The other direction is not known, and is the biggest open problem in computer science: the P vs NP problem. Defined polynomial reduction. If A is poly reducible to B and B is in P, then A is in P. If A is poly reducible to B and B is in NP, then A is in NP. Defined NP-hardness and NP-completeness. If A is poly reducible to B and A is NP-hard, then B is also NP-hard. P is closed under complement, but it is not known whether NP is closed under complement. Examples of NP-complete problems: HAM_CYCLE, Sudoku, SAT. Stated the Cook-Levin theorem.
    Reading: 7.1-7.4 (but we skipped a lot)
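The verifier view of NP from Lecture 24 can be illustrated with a sketch for HAM_CYCLE: a certificate is a proposed ordering of the vertices, and the verifier checks it in polynomial time. This is an illustrative sketch, not class material:

```python
# Polynomial-time verifier sketch for HAM_CYCLE (illustrative). The
# certificate is an ordering of the vertices; the verifier checks that it
# visits every vertex exactly once and that consecutive vertices (including
# the wrap-around pair) are joined by edges. Both checks take poly time.

def verify_ham_cycle(vertices, edges, certificate):
    if sorted(certificate) != sorted(vertices):   # each vertex exactly once
        return False
    m = len(certificate)
    edge_set = {frozenset(e) for e in edges}      # undirected edges
    return all(frozenset((certificate[j], certificate[(j + 1) % m])) in edge_set
               for j in range(m))

V = [1, 2, 3, 4]
E = [(1, 2), (2, 3), (3, 4), (4, 1), (1, 3)]
print(verify_ham_cycle(V, E, [1, 2, 3, 4]))  # True
print(verify_ham_cycle(V, E, [1, 3, 2, 4]))  # False: no edge (2,4)
```

Note that the verifier decides nothing about the graph on its own; it only checks a given certificate, which is exactly the equivalence with the NTM definition (the NTM "guesses" the certificate).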

Quizzes

We will have frequent short quizzes that you can take online on Gradescope. The goal of the quiz is to help us get an idea of how the class is doing and where you are having difficulty, and also to motivate you to prepare before each lecture, and to give you an early signal if you are falling behind. Therefore, the quiz grading is designed to emphasize taking the quiz and making an honest effort, rather than answering everything correctly.

For each quiz that you submit, your score will automatically be increased by 25% of the total point value, up to a 100% ceiling. For example, on a quiz with a maximum of 10 points, a score of 2 will be converted to 4.5, a score of 6.5 will be converted to 9, and scores of 7.5 and above will all be converted to 10. A quiz that is not submitted gets a score of 0. The lowest quiz score will be dropped.
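The conversion rule above amounts to one line of arithmetic. As an illustrative sketch (the function name is ours, not part of the course infrastructure):

```python
# Quiz score adjustment sketch: add 25% of the quiz's total point value to
# the raw score, capped at the total (a 100% ceiling).

def adjusted_score(raw, total):
    return min(raw + 0.25 * total, total)

print(adjusted_score(2, 10))    # 4.5
print(adjusted_score(6.5, 10))  # 9.0
print(adjusted_score(7.5, 10))  # 10.0 (and anything above 7.5 also caps at 10)
```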

You must take the quiz alone, without discussion with anyone else (although you are welcome to go over lecture material with other students). You don't have to finish the quiz in a single pass - you could submit part of it, and then go back and complete the rest (as long as it's before the deadline, and you do it on your own without any discussion with others). You are allowed to use your class notes/textbook when taking the quiz, although the intention is that the quiz should be solvable without any materials (after you have gone over the previous lectures and digested them). No late quizzes will be accepted.

The quiz is meant to be easier than homework problems, and meant to be completed quickly - within at most 15 minutes if you've already gone over and understood the class material to date. If you find yourself working much longer, contact us and come to office hours (sooner rather than later).

You will always have at least 24 hours for each quiz.

  • Quiz 1 out 9/12, due Tue 9/13, 11:59pm.
  • Quiz 2 out 9/19, due Tue 9/20, 11:59pm.
  • Quiz 3 out 9/29, due Sun 10/2, 11:59pm.
  • Quiz 4 out 10/13, due Sun 10/16, 11:59pm.
  • Quiz 5 out 10/31, due Tue 11/1, 11:59pm.
  • Quiz 6 out 11/9, due Sun 11/13, 11:59pm.

Homework

Homework should be submitted via Gradescope. Anonymous grading is enabled, so please avoid writing your name or other identifying information on the pages you link to the individual problems on Gradescope. If you had discussion collaborators, include their names on a separate page of your submission, not linked to any problem. We recommend that you typeset your homework (e.g., using LaTeX), but you may also submit legibly handwritten, scanned homework. We will not spend much effort trying to decipher handwriting we find illegible.

See below for our policies on lateness, collaboration, regrade requests, etc. (quick summary: you have some budget of late hours, you are allowed some limited and clearly acknowledged discussion on homework, you should not cheat, we won't tolerate it).

You will always have at least a week for each full homework (there will also likely be up to two half-sized homeworks before exams).

  • Homework 1 out 9/23, due Mon 10/3, 11:59pm.
  • Homework 2 out 10/9, due Tue 10/18, 11:59pm.
  • (Half) Homework 3 out 10/16, due Fri 10/21, 11:59pm. You may only use up to 18 late hours on this homework (out of your 120 hour total for the semester).
  • Homework 4 out 11/8, due Fri 11/18 (extended from Thurs 11/17), 11:59pm.
  • Ungraded homework: before the 11/28 lecture in the morning, watch the following video of my lecture, from timestamp 14:40 to the end, skipping 31:45-37:27.
Note: The responsibility for what is written on a submitted homework belongs solely to the student who submitted it. The teaching staff will try their best to help students and answer their questions. However, our goal is not to tell you what to write on the homework, but rather to help you understand the material, to a point where you understand exactly what the question is asking, and how to tell for yourself whether you have succeeded in solving it. In fact, this is one of the class objectives: for you to be able to tell correct solutions from wrong ones, based on the definitions and formal reasoning.

In particular, if a TA (or the professor) tells a student something, even if they accidentally make a mistake in their reasoning, the student is still responsible for catching it (at least if they are going to write something based on it), by fully understanding the problem and solution. We will also not look at your solution in advance to tell you if it is correct or not. Instead, we will try to help you figure out where there is a gap in your understanding, and why you are not confident about your solution (regardless of whether or not it is correct), towards helping you understand the material and do better.

Class policies

Grading policy

The final grade is determined by quizzes (10%), homework (20%), and two exams (70%) (see more information about quiz grading above). The worst quiz of each student will be dropped, and the worst homework of each student will get half the weight (namely, half of the worst homework will be dropped). Being able to solve the quizzes and homework is crucial for understanding and getting comfortable with the material - it is unlikely you will do well on the exams otherwise.

We will grade answers both for correctness and clarity. We grade what you wrote, not what you meant. If you submit more than one solution to a problem, we will grade whichever solution we want (we may choose the worst one or the one that is easiest to grade). If you write "I do not know how to solve this" on a homework or exam problem, you will get 15% of the points for that problem.

We will accept regrade requests for assignments on Gradescope up to one week after grades are released. Please read the published solutions and the rubric before submitting a regrade request. To submit a regrade request on Gradescope, click on the question and click "Request Regrade", then write a few sentences explaining why you think your answer is correct. When we receive a regrade request, if we realize that we made a mistake in applying the rubric, we will correct the mistake. Note that this may result in the grade going down (or up, or no change).

There will be two exams, in class:

  • Exam 1: in class, Monday October 24th.
  • Exam 2: in class, Monday December 12th (last class).
There is no cumulative exam (although the material in general builds on earlier material).

Lateness policy

Late quizzes will not be accepted. For homework, we allow up to 120 hours of lateness with no penalty throughout the semester, provided you use at most 48 hours for any one homework. Any part of an hour (e.g., a minute) counts as a full hour. This lateness time is provided to allow for last minute upload/internet problems, or in case you're busy with another project, observing a religious holiday, have a minor sickness or another issue.

Any homework that is late beyond the 120 hours total or 48 hours per one assignment will not be accepted.
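As a sketch of the rounding rule above (any part of an hour counts as a full hour, i.e., lateness is rounded up to whole hours):

```python
# Late-hours accounting sketch (illustrative only): round lateness up to
# whole hours, since any part of an hour counts as a full hour.
import math

def late_hours(minutes_late):
    return math.ceil(minutes_late / 60)

print(late_hours(1))   # one minute late counts as 1 full hour
print(late_hours(60))  # exactly one hour -> 1
print(late_hours(61))  # one hour and a minute -> 2
```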

In some cases the above lateness policy may be canceled, and no late homework will be accepted for a certain homework (e.g., before a test). You will be notified of any such case when the homework is given out.

For other emergencies (e.g., a serious family or medical emergency), please provide all necessary documentation to your advising dean, and have the dean contact me to discuss appropriate accommodations. This maintains your privacy, and saves me from having to evaluate and verify your doctor's note or family situation.

Collaboration policy

All students are assumed to be aware of the computer science department academic honesty policy.

For quizzes and exams, no collaboration is allowed.

For homework, collaboration in groups of two or three students is allowed, but not required (and for most students, I would not encourage it in this class). Collaboration in groups of more than three students is not allowed. If you do collaborate, you must write your own solution (without looking at anybody else's solutions), and list the names of anyone with whom you have discussed the problems. If you use a source other than the textbook in doing your assignment, explain the material from the source in your own words and acknowledge the source. In any case, you are not allowed to look at written solutions of any other student, even if you worked on solving the homework together. Collaboration without acknowledgment is considered cheating. Obviously, you are also not allowed to look at solutions from other classes, other institutions, previous years, and so on. If you are in doubt, ask the professor. Homework should constitute your own work.

Every student caught cheating will be referred to Columbia's Office of Student Conduct and Community Standards, as well as subject to academic penalty in the class.

Textbook

Readings and homeworks will be assigned from Introduction to the Theory of Computation by Michael Sipser. Either the Second or Third Edition is acceptable. The book's website and errata are available here. We will generally follow the book in class, but some departures are possible. Students are responsible for all material taught in class.

Optional text: Introduction to Automata Theory, Languages and Computation by John E. Hopcroft, Rajeev Motwani and Jeffrey D. Ullman

Do I have the Prerequisites for the Class?

To check and refresh on your discrete math prerequisite for the class, read Chapter 0 of Sipser's textbook (here is a pdf of this chapter if you don't have the book yet). Quiz 1 will also test some discrete math background. You can also check Tim Randolph's HW0 (here are the solutions). Finally, check out handout 1 below, with a discrete math review sheet.
If you have difficulty with any of the above, and have not yet taken 3203 discrete math, you should drop this class and take 3203 first.

Handouts and Review Resources

Below are handouts generated by the teaching staff as a supplement to class material. These do not constitute required material for the class, but rather are meant to support your learning and understanding. Most of the resources below provide further examples and review of what was already covered in class, but some of the resources provide new (and completely optional) more advanced material for interested students. We will also have occasional (and optional) in-person review sessions -- those will also be accompanied by handouts that will be posted below for all students.

Useful and/or Interesting links

  • Tools for drawing finite automata, LaTeX, etc.:
    • TikZ is a TeX package for programmatically creating graphics which can be used to draw automata. See an example here or here.
    • Finite State Machine Designer is a useful tool to draw automata, which can then be exported to LaTeX (using the TikZ package above).
    • Java Formal Languages and Automata Package (JFLAP) is a useful Java tool that allows you to build automata, test out their functionality, and easily modify them.
    • GasTeX is a LaTeX plugin for easily drawing finite automata and graphs. See also JasTex, a small Java applet for visually drawing finite automata. It automatically generates GasTeX code.
    • Detexify lets you search for LaTeX symbols by drawing them.
    • Other useful LaTeX packages (included with the standard Texlive distribution) include amsmath, amsthm and amssymb (for typesetting equations), and xyfig (for typesetting automata and trees).
  • Turing's paper "On Computable Numbers, with an Application to the Entscheidungsproblem", published 1936. (possibly more readable font for the symbols here).
  • If you want to see an explicit specific example of how you might encode a TM as a string, you can check section 2 in this document (written by TA Freddy Kellison-Linn who covered a lecture in 2018).
  • Some short videos suggested by students in the class to help understand proofs by diagonalization:
    • Numberphile short video on countability and on uncountability of the reals (Cantor's proof).
    • video on the undecidability of the halting problem (via a diagonalization proof).
  • Scooping the loop snooper: A proof that the Halting problem is undecidable, in the style of Dr. Seuss, written by linguist Geoffrey Pullum to honor Alan Turing. Copied from his page.
  • Who Can Name the Bigger Number?: An essay by Scott Aaronson, related to uncomputability and Turing Machines (read about the Busy Beaver problem, and more).