Working from draft form to a final manuscript

I have been really focused on writing an introduction to machine learning syllabus to share with everybody over on my Substack newsletter, and most of my time and energy has gone into that effort. Right now I’m at the point where a draft exists and has been shared out. That is generally a great point in the process. For me it means I need to let it breathe for a bit, then go back and reread and rework it a few days later. Picking it up with fresh eyes lets me catch the little things that otherwise seemed fine in the initial draft. During that process I learned how to make figures, tables, and generally use LaTeX syntax. That was indeed a battle, and I shared the files so others could take a look if they wanted to see how I used the syntax. I ended up learning the whole thing from a bunch of YouTube tutorials, one at a time, whenever I wanted to do something new. It was not until the last section of the material that I had to learn how to make tables in LaTeX, which was shockingly complex compared to what I expected. You have to understand a bit about how the structure works to see how to modify it in practice.
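To show what I mean about table structure, here is a minimal sketch of the kind of tabular layout I had to learn; the topics and numbers are placeholders, not the actual syllabus content, and I’m assuming the booktabs package for the horizontal rules:

```latex
\documentclass{article}
\usepackage{booktabs} % provides \toprule, \midrule, \bottomrule
\begin{document}
% The column spec "lcc" declares one left-aligned column
% followed by two centered columns; & separates cells and
% \\ ends each row.
\begin{table}[h]
  \centering
  \begin{tabular}{lcc}
    \toprule
    Topic             & Week & Readings \\
    \midrule
    Linear regression & 1    & 2 \\
    Classification    & 2    & 3 \\
    \bottomrule
  \end{tabular}
  \caption{Example course schedule (placeholder values).}
\end{table}
\end{document}
```

Once you see that the column specification, the cell separators, and the row terminators each do one job, modifying an existing table becomes much less mysterious.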

Part of learning LaTeX syntax during my journey was learning to appreciate the Overleaf website and how it manages that type of content. At first I wondered how it was any different from working in Google Docs or Microsoft Word. It really is a bit different, and it worked out well enough. It is worth the small cost, and I can see how sharing a document with collaborators is something the platform facilitates in a deeply powerful way. Now that the basic drafting process on that syllabus is complete, it is time to focus deeply on the “what’s next?” question. Within my research trajectory notes and upcoming research pages on the Weblog I have a few ideas of what I’m working toward creating. At the moment, I don’t think my work with machine learning literature reviews is complete; I may work out a few deeper looks at some of the topics contained within the syllabus. I am now able to format my research notes and literature reviews into LaTeX-based PDF documents.
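For anyone curious what formatting notes into a PDF involves, a minimal skeleton of the kind I mean looks like this; the title, section names, and geometry settings are illustrative placeholders, not my actual template:

```latex
% Compile with pdflatex (or directly in Overleaf) to get a PDF.
\documentclass[11pt]{article}
\usepackage[margin=1in]{geometry} % page margins
\title{Literature Review Notes}
\author{Author Name}
\date{\today}
\begin{document}
\maketitle
\section{Summary}
Notes go here.
\end{document}
```

From a starting point like that, adding figures, tables, and citations is mostly a matter of layering in one package at a time.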

I read an article over at The Verge that Google is tracking what I’m doing in my Google Docs, and that is not entirely surprising. I will say that during the course of writing in my Substack file, which is now drafted to week 87 of 104 planned writing sessions, the algorithm has gotten better at suggesting edits while I write and matches my phrasing better. That document about machine learning is really close to 100,000 words; right now it is at a word count of 96,925. I’m guessing that in terms of purely original technical learning prose creation I’m on the deeper end of the documents they are analyzing. Somebody, I’m sure, has written something longer. They probably have a different writing schedule than I do, and the overall feel and style is probably different.