College Studies and IT Knowledge Go Hand-in-Hand

Colleges should mandate IT education.

As I write these words, we're in the middle of moving my son into his college dorm in downtown Boston, where he's attending Emerson College as a film studies major. It's quite a scene around Boston Common and the Boston Public Garden, where thousands of students are moving into their dorms at Suffolk University, Emerson College, Anatolia College, Tufts University, Fisher College, the University of Massachusetts, and more.

As we were getting son Gregory moved in on Friday (Sept. 1) and continuing through yesterday (Sunday, Sept. 3), we saw constant lines of students queueing up at downtown dorm entrances, waiting for their turns to schlep their possessions into their respective habitations. A Sunday walk to the Isabella Stewart Gardner Museum in Boston's Fenway neighborhood showed us the same scenes unfolding at Northeastern University (undergrad student body size: about 16,000) as well.

Thus, I wasn't surprised to learn that Boston is second only to Los Angeles in terms of the ratio of college students to overall population (7.32 percent for Boston vs. 7.34 percent for Greater L.A.). All this frenetic activity got me to thinking: What do today's college students need to know about IT to do well in their jobs, once they enter the workforce?

Just the Basics, Please!

My gut feel is that every college student needs basic computing skills and a modicum of IT understanding to function as a useful member of the workforce, no matter what their chosen course of study, or (perhaps more importantly) no matter what field they wind up working in.

As somebody with multiple degrees in anthropology, and another one-and-a-half degrees in computer science, I can attest to the value of such skills and knowledge from more than 30 years of experience in and around information technology and computing. In fact, what led me to switch from anthro to CS was nothing less than "brute force economics."

The year I became eligible to start working on a Ph.D. dissertation, there were exactly two (2!) entry-level teaching jobs available for newly minted anthropology instructors. With at least 30 others projected to earn that degree the year I expected to finish up (1983, now almost lost in the mists of time), the odds were not very favorable. Bearing all of that in mind, I switched to CS and have worked in and around computing ever since.

In my opinion, what ALL students need to know about computing and IT when they get out of school is neatly captured in four CompTIA certifications. Namely:

CompTIA ITF+ (IT Fundamentals)
CompTIA A+
CompTIA Network+
CompTIA Security+

The last three items are sometimes called the "CompTIA Trinity" (or some variation thereof) because they provide basic background in PC, networking, and security fundamentals that everybody needs to know. To that mix, the ITF+ brings an overview of basic IT principles, practices and processes to help people understand what IT can do for any organization, and how best to put its capabilities to work.

Playing College Calculus

The typical undergraduate curriculum requires 120 credit hours to graduate. I'd rate the hours equivalent of the A+ at 6 hours, and each of the remaining three classes at another 3 hours. Lab requirements might bump each course by an additional hour (two for the A+, which might span two semesters rather than one, with all the others as single-semester courses).

As a fraction of 120, 15 to 20 hours represents between 12.5 percent and roughly 16.7 percent of the total load. This raises the question: Is it appropriate to devote more than 10 percent of the basic graduation requirement to IT/computing for all students?
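For anyone who wants to check that arithmetic, here's a quick back-of-the-envelope sketch in Python. The per-course hour figures are just my rough estimates from above, not anything official from CompTIA or a registrar:

```python
# Rough credit-hour estimate for the four-certification sequence.
# All figures are the ballpark guesses from this column, not official numbers.

CURRICULUM_TOTAL = 120  # typical credit hours required for a bachelor's degree

# Base lecture hours per certification course (my estimates)
base_hours = {"ITF+": 3, "A+": 6, "Network+": 3, "Security+": 3}

# Possible lab add-ons: one extra hour per course, two for the A+
lab_hours = {"ITF+": 1, "A+": 2, "Network+": 1, "Security+": 1}

low = sum(base_hours.values())        # no labs: 15 hours
high = low + sum(lab_hours.values())  # with labs: 20 hours

print(f"Credit hours: {low} to {high}")
print(f"Share of a {CURRICULUM_TOTAL}-hour degree: "
      f"{low / CURRICULUM_TOTAL:.1%} to {high / CURRICULUM_TOTAL:.1%}")
# Output: Credit hours: 15 to 20
#         Share of a 120-hour degree: 12.5% to 16.7%
```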

In my honest opinion, the answer is "yes," because of the importance of computer literacy and understanding for all workers, especially those of the white-collar kind. And if you look at the cost of the exams, courses, and practice tests involved (see my Aug. 25 column "Getting Started Down the IT Trail" for those details), the time and effort fit a general undergraduate program of study surprisingly well.

Not only that, but the cost is surprisingly affordable. Spread across the 15 to 20 credit hours' worth of material it covers, it works out to far less per credit hour than many colleges and universities charge (we're paying about $2,000 per credit hour at Emerson, for example).

Do I think it's likely that most institutions will build this kind of thing into their curricula? Alas, probably not. But because it adds relatively little to the overall cost of a bachelor's degree (about $4,000 including courses, practice tests, textbooks, and exams, by my reckoning), it may be worth throwing it into a typical four-year program as part of the mix anyway.
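To make that cost comparison concrete, here's the same kind of rough sketch using the figures cited above. The $2,000-per-credit number is just the Emerson example; per-credit tuition varies widely from school to school:

```python
# Back-of-the-envelope cost comparison using the figures cited in this column.

CERT_PACKAGE_COST = 4_000    # courses, practice tests, textbooks, and exams
PER_CREDIT_TUITION = 2_000   # example per-credit figure cited for Emerson College
CREDIT_HOURS_LOW, CREDIT_HOURS_HIGH = 15, 20

# What the same 15 to 20 hours would cost as regular tuition-bearing coursework
tuition_equivalent = (CREDIT_HOURS_LOW * PER_CREDIT_TUITION,
                      CREDIT_HOURS_HIGH * PER_CREDIT_TUITION)

# What the certification route works out to per credit-hour equivalent
per_credit_cert = (CERT_PACKAGE_COST / CREDIT_HOURS_HIGH,
                   CERT_PACKAGE_COST / CREDIT_HOURS_LOW)

print(f"Same hours as regular coursework: "
      f"${tuition_equivalent[0]:,} to ${tuition_equivalent[1]:,}")
print(f"Certification route, per credit hour: "
      f"${per_credit_cert[0]:,.0f} to ${per_credit_cert[1]:,.0f}")
# Output: Same hours as regular coursework: $30,000 to $40,000
#         Certification route, per credit hour: $200 to $267
```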

I'm planning on having this very chat with my own offspring at the next opportunity. Other parents, guardians, friends and family of college students facing uncertain or challenging employment prospects may want to do likewise.

About the Author

Ed Tittel is a 30-plus-year computer industry veteran who's worked as a software developer, technical marketer, consultant, author, and researcher. Author of many books and articles, Ed also writes on certification topics for Tech Target, ComputerWorld and Win10.Guru. Check out his website at www.edtittel.com, where he also blogs daily on Windows 10 and 11 topics.