frugal technology, simple living and guerrilla large-appliance repair

Regular blog here, 'microblog' there

Many of my traditional blog posts live on this site, but the great majority of my social-style posts can be found on my much-busier microblogging site at updates.passthejoe.net. It's busier because my BlogPoster "microblogging" script generates short, Twitter-style posts from the Linux or Windows (or anywhere you can run Ruby with too many Gems) command line, uploads them to the web server and sends them out on my Twitter and Mastodon feeds.
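BlogPoster itself does a lot more, but the Mastodon half of the idea boils down to one API call. Here's a minimal sketch using plain curl, with the instance URL and access token as placeholders rather than my real setup:

    # Post a short status to a Mastodon instance (host and token are placeholders)
    curl -H "Authorization: Bearer $MASTODON_ACCESS_TOKEN" \
         --data-urlencode "status=Short, Twitter-style post with a link" \
         https://mastodon.example/api/v1/statuses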

I used to post to this blog via scripts and Unix/Linux utilities (curl and Unison) that helped me mirror the files locally and on the server. Since this site recently moved hosts, none of that is set up. I'm just using SFTP and SSH to write posts and manage the site.
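When that setup was live, the heart of it was a single Unison call over SSH -- something like this, with the paths and hostname as stand-ins rather than my real configuration:

    # Two-way sync of the local blog tree with the copy on the web server
    unison -batch ~/blog ssh://user@example.com//home/user/htdocs/blog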

Disqus comments are not live just yet because I'm not sure about what I'm going to do for the domain on this site. I'll probably restore the old domain at first just to have some continuity, but for now I like using the "free" domain from this site's new host, NearlyFreeSpeech.net.

Mon, 26 Feb 2018

If not SICP, then what? Maybe HTDP?

Hard-core CS geeks worship at the altar of SICP, the acronym for the seminal Structure and Interpretation of Computer Programs by MIT's Harold Abelson, Gerald Jay Sussman and Julie Sussman. The book is a functional-programming deep dive into Scheme and head-scratching problems, and it set the curriculum for "beginning" computer science and engineering students at MIT from the 1980s onward.

While SICP (yes, everybody uses the acronym) is considered a holy grail for serious computer scientists, it's hella hard to figure out. For me, programming "methods" that demand jumping through intellectual hoops made of mathematics get in the way of learning the actual programming.

Confession time: I have the same problem with Robert Sedgewick and Kevin Wayne's Computer Science (aka Introduction to Programming in Java). The examples the book presents and the problems it asks students/readers to solve involve a lot of thinking about the mathematical nature of the problem, and the nitty-gritty of making programs seems to fade into the background. That's why books like Y. Daniel Liang's Introduction to Java Programming seem to meet me where I am in terms of learning programming.

But what about the lofty, Lisp-y ideals of SICP? Even MIT has moved on, and a contingent of professors at other schools has developed a competing Scheme/Lisp-driven textbook, How to Design Programs (read the latest edition for free, and get the print book from MIT Press).

So why the shift away from SICP? The professor-authors of How to Design Programs lay out their reasoning in this 2004 paper, The Structure and Interpretation of the Computer Science Curriculum.

Here is the abstract from the paper written by the HTDP author-professors (and yes, HTDP, sometimes HtDP, is the accepted shorthand; books with fervent fans get acronyms):

Twenty years ago Abelson and Sussman’s Structure and Interpretation of Computer Programs radically changed the intellectual landscape of introductory computing courses. Instead of teaching some currently fashionable programming language, it employed Scheme and functional programming to teach important ideas. Introductory courses based on the book showed up around the world and made Scheme and functional programming popular. Unfortunately, these courses quickly disappeared again due to shortcomings of the book and the whimsies of Scheme. Worse, the experiment left people with a bad impression of Scheme and functional programming in general. In this pearl, we propose an alternative role for functional programming in the first-year curriculum. Specifically, we present a framework for discussing the first-year curriculum and, based on it, the design rationale for our book and course, dubbed How to Design Programs. The approach emphasizes the systematic design of programs. Experience shows that it works extremely well as a preparation for a course on object-oriented programming.

The paper gets right into what we might want to call "the SICP problem" (slight deletions marked by ellipses serve to remove references to charts, and emphasis is mine):

More generally, SICP doesn’t state how to program and how to manage the design of a program. It leaves these things implicit and implies that students can discover a discipline of design and programming on their own. The course presents the various uses and roles of programming ideas with a series of examples. Some exercises then ask students to modify this code basis, requiring students to read and study code; others ask them to solve similar problems, which means they have to study the construction and to change it to the best of their abilities. In short, SICP students learn by copying and modifying code, which is barely an improvement over typical programming text books.

SICP’s second major problem concerns its selection of examples and exercises. All of these use complex domain knowledge. ... Some early sections and the last two chapters cover topics from computer science ...

While these topics are interesting to students who use computing in electrical engineering and to those who already have significant experience of programming and computing, they assume too much understanding from students who haven’t understood programming yet and they assume too much domain knowledge from any beginning student who needs to acquire program design skills. On the average, beginners are not interested in mathematics and electrical engineering, and they do not have ready access to the domain knowledge necessary for solving the domain problems. As a result, SICP students must spend a considerable effort on the domain knowledge and often end up confusing domain knowledge and program design knowledge. They may even come to the conclusion that programming is a shallow activity and that what truly matters is an understanding of domain knowledge. Similarly, many students lack an understanding of the role of compilers, logical models of program execution, and so on. While first-semester students should definitely find out about these ideas, they should do so in a context that reaffirms the program design lessons.

In summary, while SICP does an excellent job shifting the focus of the first course to challenging computer science topics, it fails to recognize the role of the first course in the overall curriculum. In particular, SICP’s implicit approach to program design ideas and its emphasis on complex domains obscures the goal of the first course as seen from the perspective of a typical four-year curriculum.

The HTDP professors want to make clear that the secret sauce is not Scheme, per se. They use a tuned subset of the language, something made possible by the Racket IDE (formerly DrScheme, now DrRacket):

Combining SICP with a GUI-based development environment for Scheme won’t work better than plain SICP. The two keys to our success were to tame Scheme into teaching languages that beginners can handle and to distill well-known functional principles of programming into generally applicable design recipes. Then we could show our colleagues that a combination of functional programming as a preparation for a course on object-oriented programming is an effective and indeed superior alternative to a year on just C++, Java, or a combination.

This is a deep topic, and I'm more concerned with figuring out the best way to learn programming concepts and practices without going too deep into the quirks of a single programming language, or hitting roadblocks in the form of scientific and mathematical concepts that are ancillary to the practice of programming itself.

Of course there's always the pull between the foundational approach favored by HTDP -- and maybe even by the "major" Java and C++ textbooks and the college courses that use them -- versus the practical approach of "here's how to make a program that actually does things ... learn these in-demand frameworks ..." and, more urgently, "develop a foundation in JavaScript/Ruby/Clojure and go forward from there."

Sun, 25 Feb 2018

The Symbolics Lisp Machine was a real thing

I came across this fascinating article from LispODROID, Ergonomics of the Symbolics Lisp Machine - Reflections on the Developer Productivity, about an actual computer whose system software and applications were coded in Lisp, with the machine's primary use case appearing to be coding more things in Lisp.

Here's a picture of one model of the Symbolics Lisp Machine, which the article says was sold between 1981 and 1993:

A Symbolics Lisp Machine

Being somewhat fascinated by Lisps (including Scheme and Clojure), I was interested.

A look at the Wikipedia page for Symbolics told me that while the company was headquartered in Cambridge and later Concord, Massachusetts, it made the Symbolics Lisp Machine in the San Fernando Valley community of Chatsworth, long known as a home to warehouses and manufacturing facilities. (Also pornography. But that's pretty much the whole Valley then and now, especially then.) The Chatsworth facility closed in July 2005.

In the beginning, Symbolics Lisp Machines cost $70,000. Each. In 1980s money.

Among the other tidbits: Symbolics.com is the first-ever registered .com domain. (It might be the first domain name of any kind ever registered, but that seems like a harder claim to prove.)

Symbolics was born at the MIT Artificial Intelligence Lab, and its desire to keep its Lisp code away from another MIT-launched company, Lisp Machines Inc., helped push MIT's Richard Stallman to create the free software movement.

I guess more than anything I'm fascinated by computer systems that really went their own way in terms of conception, design and philosophy. You can code in Lisp today on pretty much any computer, but a Lisp Machine? That's something you don't hear about today in our Unix/Windows/nothing-else world.

Some other links:

I reserve the right to add to this article.

Thu, 15 Feb 2018

I am trying to make the Xenialpup version of Puppy Linux work for me

Since I am not anxious to either dual-boot or replace Windows 10 with Linux on my main laptop (HP Envy 15-as133cl 15t), I thought I would go back to the live distribution that introduced me to Linux -- Puppy -- and see if I could make it do all the things I want and need it to do.

My history with Puppy goes WAY back. I remember running version 2.13 on all kinds of hardware -- castoff laptops, converted thin clients. It would run on anything.

Back then I booted Puppy from CDs. Nowadays, with "modern" Puppy I could boot from USB and save either to the laptop's hard drive or another USB drive.

I downloaded the latest Puppy, Xenialpup, and used the Fedora Media Writer in Windows to put it on a 4GB USB flash drive. Xenialpup is UEFI-compatible, and after turning off Secure Boot on the laptop, I was off and running the super-fast JWM desktop that has traditionally anchored the Puppy live system.
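For anyone doing the same job from Linux without a GUI tool, the old-school equivalent is dd. The ISO name and /dev/sdX below are placeholders, so triple-check the device name before running it:

    # Write the ISO straight to the flash drive -- everything on the drive is erased
    sudo dd if=xenialpup.iso of=/dev/sdX bs=4M status=progress
    sync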

Surprisingly, I was able to get a pretty comprehensive Ode blog-posting setup going. I have Unison: I pulled version 2.40 from an old Ubuntu package and used dpkg-deb to extract the binary since I need the old version and not the Ubuntu Xenial version of Unison, which is 2.48. (On a related note, why is it so hard to get a working Unison 2.48 for CentOS?)
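The extraction trick is generic and worth knowing. It goes roughly like this, with the .deb filename standing in for whatever old package you track down:

    # Unpack a .deb without installing it, then fish out the binary
    dpkg-deb -x unison_2.40_amd64.deb ./unison-extract
    cp ./unison-extract/usr/bin/unison /usr/local/bin/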

I pulled all of my scripts from the Windows Subsystem for Linux, and I've already synced the blog over to this system.

Thankfully, Puppy's unusual running-as-root way of working didn't screw up my file ownership on the server, or in my Windows Subsystem for Linux copy of the filesystem.

There was one old problem from Linux that I haven't had to deal with in a while: While typing, I had the typical "jumping" behavior due to "interference" from the touchpad. Though Puppy doesn't have a "disable the touchpad while typing" setting, I was still able to solve the problem quickly by disabling tap-to-click with the system's straightforward Input Wizard configuration utility.
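On systems without a tool like the Input Wizard, the same fix can usually be made from the command line, assuming the Synaptics driver is handling the touchpad:

    # Turn off one-, two- and three-finger tap-to-click on a Synaptics touchpad
    synclient TapButton1=0 TapButton2=0 TapButton3=0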

Where I ran into problems was with my development environment. I got Ruby to work in Puppy after a little effort, but I couldn't get the equivalent of ruby-dev to build things like the Ruby Twitter gem, and the gem itself, which is a package in Ubuntu, is not easily available for Xenialpup.
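For comparison, on a mainstream Debian/Ubuntu system the recipe is short: build tools, Ruby headers, then the gem. (In Puppy, as I understand it, the compilers and headers come in the separate devx SFS module, and that's where things fell apart for me.)

    # Build tools and Ruby dev headers, then a gem with native extensions
    sudo apt-get install build-essential ruby-dev
    gem install twitter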

So I can post to the blog but can't use my "new" social-posting routine that turns URLs into Ode-compatible files and then sends the results to both the blog and Twitter.

I can't run all the gems I want in Ruby. But I am also exploring Clojure, so maybe that would work.

I did manage to install the JDK, though I had to add it to my path, opting to do so in /etc/profile, in order to get it to work.
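The additions to /etc/profile amounted to a couple of lines like these, with the JDK path being wherever you happened to unpack it:

    # Tell the shell where the JDK lives -- adjust the path to your own unpacked JDK
    export JAVA_HOME=/opt/jdk-9.0.4
    export PATH="$PATH:$JAVA_HOME/bin"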

I downloaded the Leiningen script and installed it. I can create new Leiningen projects, but the REPL errors out in spectacular fashion.
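Installing Leiningen really is just a matter of putting its script on the path and running it once so it can fetch its own jar:

    # Fetch the lein script, make it executable and let it self-install on first run
    wget https://raw.githubusercontent.com/technomancy/leiningen/stable/bin/lein
    chmod +x lein
    mv lein /usr/local/bin/
    lein new app hello
    lein repl    # this is the step that errors out for me on Puppy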

I haven't tried to get Node working yet, but the Ubuntu version is old (4.2.6), and I'm not confident that this installation will play well with the overall npm ecosystem. I could try to download and install direct, but there's potential for a lot of trouble.
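Installing direct would mean grabbing an official binary tarball from nodejs.org and putting it on the path, roughly like this (the version number is a placeholder for whatever is current):

    # Unpack a Node binary tarball and put it on the PATH
    tar -xJf node-v8.9.4-linux-x64.tar.xz -C /opt
    export PATH="$PATH:/opt/node-v8.9.4-linux-x64/bin"
    node --version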

The mechanics of Puppy are working great. Despite the unorthodox packaging and software installation, I did manage to get quite a few things working. I have my favorite text editor (Geany), the full LibreOffice suite via SFS package, the Firefox browser with all of my bookmarks synced, and a super-fast desktop packed with a lot of (mostly) home-grown utilities.

But I couldn't get Puppy to go the last mile.

So the "story" for me and Puppy Linux in 2018 is good until the point at which I want to do development work. At that point it's better to stay in Windows (and the WSL) or opt for a traditional Linux installation.

Now if only there were a "full" Debian live environment that allowed for persistence. Maybe there is. I'll be looking.

Showtime's 'I'm Dying Up Here': Comic timing ... and parking

So I've been watching Showtime's "I'm Dying Up Here," the excellent series about LA's 1970s stand-up comedy scene and its "Carson's couch means you've made it" vibe, and all I can think about is how easy it was to park your car in LA back in those days.

Wed, 07 Feb 2018

Pragmatic Programmers, No Starch Press, Manning Books, Packt and Apress fill void left by O'Reilly Media

It wasn't a total surprise in mid-2017 that O'Reilly Media was retreating from tech-book publishing, especially when it comes to offering digital books. It's been a while now since the company announced that it would no longer sell books -- digital or print -- directly and would instead offer all of its books through the Safari online service and in print via Amazon.

Here are two articles from O'Reilly Media on their decision:

If you look at it, this decision by O'Reilly to exit the direct-sales market didn't just happen out of the blue in June 2017. It was a slow decline over years. O'Reilly (and most other publishers, really) used to offer a whole lot of books on Linux and Unix, and even more on individual programming languages.

O'Reilly really slowed down what it did for Linux (pretty much stopping that category) and put out programming books with longer and longer gaps in between.

Announcing that they would no longer sell directly to readers was just the end of what had already been in motion for a long time. I don't begrudge O'Reilly its sizable conference business, which I'm sure "carried" the book publishing part of the company for years. I also can't fault them for pulling back. It's a business decision. Sure, they are killing their brand, but they don't see it that way. Maybe a brand isn't so valuable if you can't make money with it.

Luckily we have a few tech-book publishers who are doing a great job filling the sizable gap (and, if I'm right, sizable market) left by O'Reilly.

In no particular order, my favorite tech publishers are Pragmatic Programmers, No Starch Press and Manning Books.

On the second tier are Packt and Apress.

I'd also like to mention Leanpub, which is more of an author "platform," but has a lot of content I like.

And then there's the rest of the self-published tech-book world, which you can tap into via Amazon and all over the Web. In particular, a lot of academics are publishing extremely valuable books for free via PDF, sometimes also publishing in print. I've even heard some authors for the "real" publishers say their books sell better when they're also available for free in electronic form. Maybe a free book draws such a big audience that the slice of readers who then want a paper copy outnumbers the buyers who would have taken a chance on a book they couldn't read first.

What O'Reilly Media used to "bring" was a guaranteed level of quality. I think that PragProg, Manning and No Starch are continuing that tradition. They're "gatekeepers" in the sense that you know it's going to be a good book that really offers value. I don't always get that sense with Packt books. Apress seems to bring a good level of quality, but it's not at PragProg/Manning level.

I'm sad that O'Reilly went all in with its digital subscription model, but the company was fading from relevance for years before it made that move. All the more reason to support the current publishers who are keeping quality high and selling direct to their readers.