Numbers in context − $1M lump sum, or $1000 per week?

There are some lotteries that allow the choice of $1000 per week for life, or $1,000,000 up front. Let's look at the math behind it. Now in Canada, prize money is not taxable, so either way there is no tax on the principal amount (once you start earning money on the principal, that money is taxable). One person recently in the news was 20 years old, so we'll take "life" to mean 60 years on top of that. So which is the better option?

The first option is to take $1M up front. $1M is not really enough to retire on at age 20, so let's look at what you could do with it (you would still need a job either way).

If someone were buying a large item, like a house, this may be an ideal option. It might leave you with next to no mortgage, which is nice; in fact it might actually save money over the lifespan of the mortgage. For argument's sake, if a house costs $1M, and you need a 20% down-payment regardless, then the mortgage would be for $800,000. At 4% over 25 years, the interest would be around $462K. Not a trivial savings if you pay cash for the house − it's basically saving $4,208 per month in payments. (Note this is a very simplistic calculation, just to prove a point: a fixed rate of 4% for 25 years is unrealistic in any situation, since fixed-rate terms are typically 5 years in Canada.)
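
For anyone who wants to check the arithmetic, here is a small Python sketch of the mortgage numbers. It assumes the Canadian convention of semi-annual compounding on a fixed rate, which is what produces the roughly $4,208 monthly payment and roughly $462K of interest; the function and figures are illustrative only, not financial advice.

```python
# A very simplistic mortgage calculation (illustrative only, not financial advice).
# Canadian fixed-rate mortgages compound semi-annually, which is the assumption
# that yields the ~$4,208/month and ~$462K interest figures above.

def monthly_payment(principal, annual_rate, years):
    # Effective monthly rate derived from semi-annual compounding.
    monthly_rate = (1 + annual_rate / 2) ** (1 / 6) - 1
    months = years * 12
    return principal * monthly_rate / (1 - (1 + monthly_rate) ** -months)

principal = 800_000                          # $1M house less a 20% down-payment
payment = monthly_payment(principal, 0.04, 25)
total_interest = payment * 25 * 12 - principal

print(f"Monthly payment: ${payment:,.0f}")          # approx. $4,208
print(f"Total interest:  ${total_interest:,.0f}")   # approx. $462,000
```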

The alternative of course is to invest the $1M. Now obviously there are a myriad of options here, from risky stock-type investments to more conservative term deposits. It's hard to look at scenarios here because there are so many options, including whether someone wants a return every year, or just wants the investments to continue growing. Let's just look at the scenario of a GIC (Guaranteed Investment Certificate), which is fairly low risk (but the money can't be touched for the term of the investment). Of course any investment gains will be taxed… the rate really depends on what other income a person has. Then there is also inflation. All investments are compounded annually (for simplicity). In addition, the interest would add roughly $30K a year to income come tax time.

 5 years at 3% = $1,159,274 
10 years at 3% = $1,343,916
25 years at 3% = $2,093,777
50 years at 3% = $4,383,906

Of course it's hard to leave an investment for 10 years without touching it, let alone 50. Still, if you left it for 25 years you would be about $630,000 ahead of the mortgage-free scenario, since roughly $1.09M in investment gains beats the roughly $462K of interest saved by paying cash (i.e. you paid the mortgage for 25 years and this is the payback). If we factor in inflation of 2% for the 25-year scenario, then that $2M-plus would only be worth about $1,276,222 in today's dollars (Bank of Canada inflation calculator). See, making money from money is tricky. And you still have to pay the taxes on money earned.
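
Here's a small Python sketch of the compounding and inflation arithmetic above, just to show where the figures come from. It assumes a flat 3% return compounded annually and 2% inflation, and ignores taxes.

```python
# A sanity check on the GIC figures (a sketch: real GIC rates, terms and taxes
# will differ). Assumes a flat 3% annual return and 2% inflation.

principal = 1_000_000
rate = 0.03
inflation = 0.02

for years in (5, 10, 25, 50):
    nominal = principal * (1 + rate) ** years
    real = nominal / (1 + inflation) ** years    # value in today's dollars
    print(f"{years:>2} years: ${nominal:,.0f} nominal, ${real:,.0f} in today's dollars")

# The 25-year case gives roughly $2,093,778 nominal and $1,276,222 in today's
# dollars, matching the Bank of Canada style calculation quoted above.
```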

Now it is possible to put some money in a TFSA, which is tax-free, but at 20 that's probably a max of $20,500 if you turned 18 in 2023. Some will say "invest it in the stock market where you can earn 10% and live off the earnings". Never listen to this sort of advice. The high returns of the stock market come with high risk, and higher taxes on earnings. 10% seems nice, but the wrong strategy can see $1M turn into $200K overnight. I could give a scenario here, but it's not as simple as compound interest. Needless to say, there is no such thing as 'fast money'.

The second option is to take $1000 per week for life. That's $52,000 per year, which is about equivalent to a pre-tax salary of $70,000 (in Ontario). Over 60 years that's $3.12M. Like I said, you will likely still need a job, unless you are living off-grid somewhere. That's tax-free money. You could live off it, or pay the mortgage yearly (it just about covers the payments on the $800K mortgage), or use part of it and invest part. The good thing is that after 25 years the mortgage is paid, and the remaining 35 years (in a best-case scenario) are all gravy. In all likelihood it would allow someone to retire early (because with the mortgage paid, any extra money someone earns can be invested). For example, if we put in the maximum (current) TFSA contribution of $7,000 each year, and assume we already have $20,500, then after 35 years at 3% annual growth the TFSA would be worth $493,615 − tax free.
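
And a quick sketch of that TFSA arithmetic: $20,500 of existing room plus $7,000 contributed at the start of each year, growing at 3%. The $7,000 limit is today's, and carrying it forward for 35 years is purely an assumption.

```python
# The TFSA arithmetic: $20,500 of existing room plus $7,000 added at the start of
# each year, growing at 3% annually. Current contribution limits are an assumption
# and will change over time.

balance = 20_500
for year in range(35):
    balance += 7_000      # contribution at the start of the year
    balance *= 1.03       # annual growth

print(f"TFSA after 35 years: ${balance:,.0f}")   # approx. $493,615, tax free
```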

But let's not forget inflation. In 25 years the $1000 a week may only be worth $400-$600 in today's dollars, i.e. you will still get $1000 per week, but its purchasing power will be greatly reduced. But if you are young, what $52,000 a year offers is a sense of freedom. This scenario is also for a 20-year-old; winning the money as a 40 or 50 year old would of course change which option is chosen.

You see there are no easy choices to be made − there is no right or wrong answer, because no-one has a crystal ball to see what the future holds. Every scenario has its pros and cons. But needless to say, if you do win any sort of money, seek the help of a reputable financial advisor at an established institution.

Numbers in context − Cookware statistics

Statistics are interesting because they are often used to sway arguments, or entice people. This discussion looks at what happens when plain statistics are presented without any underlying context. Take the example of a cookware company that uses statistics in its advertisements (very few other companies provide statistics of any sort). Some of the statistics are shown below.

  • 103 Michelin-recognized restaurants use the cookware.
  • The cookware is used in over 2,000 global restaurants.
  • 90,000 pans in service.

These numbers may seem significant, but put in context, they are not. Take the first statement, "103 Michelin-recognized restaurants use the cookware". Worldwide there are approximately 18,713 Michelin-recognized (not starred) restaurants, so 103/18713 = 0.55%. One would argue that this really isn't a significant number. If we were to look at only Michelin-starred restaurants, some 3,700 worldwide, the number would increase to 103/3700 = 2.78%. The number looks a lot better if we only consider North America, where there are 1,941 Michelin-recognized restaurants, so 103/1941 = 5.3%, which is roughly ten times as significant, but still not that much.

For the second statistic, that the cookware is "used by over 2,000 global restaurants", let's look at a number of scenarios. This is hard to quantify, because how do you define a "global restaurant"? It has to be different from a run-of-the-mill restaurant, because there could be 15-25 million of those. A better way to look at it is to consider the number of professional chefs who run restaurants. Let's consider just Canada first, where there are 62,200 chefs (2023). This would give us 2000/62200 = 3.2%. If we look at the USA, the number of chefs climbs to 286,000, and the share drops to 0.7%. Neither is earth-shattering, and if we include worldwide restaurants the number just plummets.
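
Here is the same "in context" arithmetic for the first two statistics as a small Python sketch. The restaurant and chef counts are just the rough figures quoted above, not authoritative data.

```python
# The "in context" arithmetic for the first two statistics, as a sketch. The
# restaurant and chef counts are the rough figures quoted above, not authoritative.

claims = [
    ("103 vs Michelin-recognized restaurants worldwide",   103,   18_713),
    ("103 vs Michelin-starred restaurants worldwide",      103,    3_700),
    ("103 vs Michelin-recognized restaurants, N. America", 103,    1_941),
    ("2,000 vs chefs in Canada (2023)",                  2_000,  62_200),
    ("2,000 vs chefs in the USA",                         2_000, 286_000),
]

for label, uses, total in claims:
    print(f"{label}: {uses}/{total} = {uses / total:.2%}")
```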

The last statistic, 90,000 pans, is quite meaningless. It's just a reflection of how many pans have been sold, and can't really be compared to other companies because they don't publish their data. Some vague figures suggest the Le Creuset foundry in France produces 25,000 items a day. Restaurants typically use stainless steel or aluminum pans, and to put the number into context, there are circa 97,000 restaurants in Canada alone (of course the 90,000 pans were never claimed to be in restaurants exclusively).

There is nothing to say that the numbers used in the advertisements are in any way incorrect, but they do carry a level of ambiguity because they aren't given any frame of reference. Numbers only mean something if put into context.

The age of computer science is waning (mostly)

There was a time when most people considered the future of computer science to be unlimited. A degree in computing was a 'golden ticket'. But the labour market may have other plans. A recent study by the Federal Reserve Bank of New York highlights a 6.1% unemployment rate for computer science. To put that into context, it's on par with chemistry, but not as bad as fine arts (7.0%), computer engineering (7.5%), or physics (7.8%). History majors, often derided by STEM types for their "unemployable" degrees, are at 4.6%; geography is at 3.3%, biology 3.0%, and accounting 1.9%. The flip side of the study shows that CS still has one of the top three early-career median wages, at $80K. So remuneration is still okay, but there are fewer jobs, and in reality those jobs will go to the best computer science students. Which makes sense. Things may be a little different in Canada, where there is still a talent shortage.

But the chickens are slowly coming home to roost. CS may have become a victim of its own success, perhaps because AI automation is replacing the lower tier of jobs for CS graduates: you know, the ones that likely involve some programming, held by people who didn't do too well in their CS degree. AI can do their jobs; it was always just a matter of time before it happened. There is still a talent shortage in Canada, so I wouldn't say this is anywhere near the end of computer science, but from here on in graduates will need exceptional skills, and experience, in order to get the good jobs.

How will this affect computer science? Well, CS has long been a cash cow for universities. Let's face it, it's a STEM discipline that doesn't need a lot in the way of resources − no large labs beyond a room of computers, and even then most students have their own. There is no need for a chemistry or biology lab, or an engineering workshop. With the knowledge that there are plenty of jobs, schools filled CS degrees up, sometimes with students who were not at all suited to the degree. They often suffer from a lack of real interest, poor problem-solving skills, and an inability to program (yet somehow they get degrees). Class sizes have ballooned, resulting in a reduced student experience. CS degrees have become a promise of a high-paying job for all, or at least that's how they were marketed.

If you're doing a computer science degree, as much as I don't like AI, I imagine your best path forward is a degree that is AI-centric. But again, these are not degrees for people who have the talent to move the industry forward. The days of coding jobs for mediocre CS grads are over. Of course, if you want a truly AI-resistant job that pays well, there is healthcare, and the skilled trades, especially things like welding (I kid you not).

Has smartphone technology peaked?

Apple just released the latest iPhone, the 17. It's boring. Well, let me put that statement into context: it's no more boring than the 16, or the 15. I have a 14, and frankly I don't know what Apple is doing anymore. It was once a company that came up with leading-edge, exceptional tech. Now each new release is just blah, underwhelming even. I don't know why people bother buying it. More AI? Yeah, nobody needs that. More cores? Yeah, no.

I mean, to be completely honest, most tech companies aren't producing anything exciting anymore. It's the same with camera companies: oh, another 60MP camera, whoop-de-do. Nothing exciting here, move along… more megapixels is somewhat meaningless today. Yet somehow the 48-megapixel cameras are a selling point. When is anyone ever going to use 48-megapixel photos on a smartphone? Great to upload to social media − not! The sensor hasn't changed size; they just crammed more photosites in. The main camera has a 1/1.28-inch sensor, meaning it's approximately 7.5×10 mm in size. It relies on a lot of processing to make things look nice. But even with 48MP I doubt future phones will need to go any higher. I mean, why? Great for pixel-peeping, I guess.
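
As an aside, here is roughly where that 7.5×10 mm figure comes from. Sensor "type" sizes are a legacy convention; the sketch below assumes a ~16 mm diagonal for a 1-inch-type sensor and a 4:3 aspect ratio, both of which are approximations rather than official specs.

```python
# Roughly where the 7.5 x 10 mm figure comes from. Sensor "type" sizes are a legacy
# convention: a 1"-type sensor has roughly a 16 mm diagonal, and a 1/1.28"-type
# sensor scales that down by 1.28. The constants are approximations, not specs.

def sensor_dimensions(type_denominator, one_inch_diagonal_mm=16.0, aspect=(4, 3)):
    diagonal = one_inch_diagonal_mm / type_denominator
    w, h = aspect
    scale = diagonal / (w ** 2 + h ** 2) ** 0.5
    return w * scale, h * scale   # (width, height) in mm

width, height = sensor_dimensions(1.28)
print(f'1/1.28"-type sensor: about {width:.1f} x {height:.1f} mm')   # ~10.0 x 7.5 mm
```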

Perhaps the one thing I do see with this phone is some emphasis on improved battery life, though you need it given the high-end processing for the cameras, etc. Supposedly 27 hours of local video playback, something Tom's Guide demonstrates is fairly on the mark. Overall, good marks on the battery, an area of tech that has been somewhat complacent over the years.

The rest of the iPhone is basically the same sort of thing we've seen for the past 10 years, perhaps incrementally improved. Don't get me wrong, I like the iPhone, but I think that technology in general has become so ubiquitous that there is no longer any reason to upgrade a phone too soon. Ask yourself what functions you use on a smartphone. Texting? Video calls? Maps (and let's face it, it's likely Google Maps)? The camera? When was the last time you downloaded a genuinely useful app? In fact if you deleted all the apps you never use, you wouldn't have many left.

Are we now creating tech for tech’s sake? Do we need new phones every year? Likely not. What we need is some leap into the next evolution of communications tech, something I doubt we will see anytime soon. Perhaps a ‘neural iPhone’ that can connect directly to the brain so you can take pictures ‘through-the-eye’?

Is the human mind becoming overwhelmed?

In the modern age it seems like everyone is dealing with some form of mental health issue. Maybe this has always been the case, and we are just realizing it now. Or perhaps it is because, once we moved away from 'survival mode' a while back (not that far from living memory though), our minds have somehow tried to counterbalance the void. It's really hard to know. What is certain is that there is far too much information being thrown at us on a daily basis, and it's hard to imagine that the mind is actually capable of dealing with it all. In the 1970s there were a few core sources of information, or news: newspapers, TV news, and magazines. If you didn't actively go looking for news, you could avoid it. Nowadays that's no longer true, unless you live an offline life somewhere in the countryside.

The problem is one of too many data streams. All day long we are bombarded with information from every direction, with an increasing amount of misinformation thrown into the melee. It's hard to believe the human brain can actually handle the daily deluge, and as a result I think it probably triages information as much as it can to avoid an overload. This may be even more problematic for children who are put in front of a tablet at six months old, and become overstimulated with information at a time when they should not be.

There was a time, not that long ago, when humans had to deal more with daily life and survival than anything else. Their concern was likely the safety and well-being of the community around them. While they may have read about events in other places in newspapers, the news was often days old, and didn't really affect their daily lives. Over the years the lag between events and news shortened, and with the mainstreaming of TV in the 1950s and 60s it changed to almost real-time, albeit usually summarized in the evening news broadcast. Fast-forward to 2025 and we watch events as they happen. There are still many places in the world where people spend every day in survival mode, and so have no real time for news from other places.

Not a lot of news is positive in the overall scope of things. Most is negative in some way, and the best way of dealing with negative information is just not to absorb it in any form. Yes, things happen all around the world, but for most people it is impossible to do anything about them, so filling our minds with additional information does nothing for our mental health. It would almost be better if we didn't absorb so much of the world's happenings on a daily basis.

To be informed, but not overloaded.

Computer-science is cooked (at least for some)

I never thought we would see this day, because computer science was supposed to be "the" degree to get. It seemed like a discipline with endless possibilities. But alas, what we now have is the perfect example of technology, i.e. AI, turning on its creator. Not in any sort of dystopian or bad way, but simply by taking their future jobs. The industry has slowed down over the past few years, and that is really for two reasons. Firstly, technology has somewhat plateaued. Look at the iPhone − what exactly is awe-inspiring about it anymore? Not particularly much of anything. Why would I upgrade to a new iPhone? More AI? Please, I like to think.

More importantly though, the industry has become a victim of itself, and by that I mean AI. While AI may be 'okay' at answering people's queries (or rather just finding the answer in an uber-vast database), it is surprisingly good at coding. That means it is ideally suited to replacing the people who created it, or at least the lower rungs of the ladder. The reality is that mediocre computer science graduates are going to find it harder to find meaningful jobs. The days of lucrative jobs in CS for everyone are over, mainly because AI can likely write more efficient code in a more timely manner than a human can. It's no different to when combine harvesters displaced human power for harvesting grain − machines were just faster and more efficient.

So where does this leave us? Well, it's a grim time for programmers, or more specifically entry-level programmers. This in all likelihood means the number of students enrolled in CS courses will decline over time. College-level courses might be wiped out completely, and some university-level degrees will take a hit. But some of those students were likely on the mediocre side of things anyway (the type that uses AI to do their assignments), and so losing them won't be a big issue. Some have no natural talent for CS; they just do it because they perceive a large salary at the end of it. But it likely won't impact people who are problem-solvers. AI might be able to do some of this, but it too has its limitations − at the moment its intelligence comes largely from the thoughts of humans.

There will always be jobs for good software engineers, and people who can think outside the box. AI can assist with optimization and analysis of software systems, but ultimately, human expertise and insight are still needed to guide development, inspire creativity, and make critical decisions. While AI can mimic creativity, it struggles to produce truly original and innovative solutions to problems. Human CS specialists bring a blend of creativity, emotional intelligence, and cultural understanding to the software design process. And honestly, reducing the size of CS programs will only benefit those who want a better learning environment. It may not be great for CS departments, but that is just how things go sometimes.

P.S. Frankly I taught enough students over the years to see the gradual influx of mediocrity when it came to things like programming. Some people just don’t have the skill set to solve problems and implement them in any sort of efficient manner. These are the people who ultimately turn towards AI to solve their problems for them, so effectively the machine is doing their work and they have no intrinsic value as programmers. Good, talented students will always have a bright future in CS.

P.P.S. Will the AI bubble burst? This article from The Telegraph is sombre reading. Apparently the vast majority of AI investments are yielding zero returns.

Predictions of the internet : 1994

“Americans have fallen in love with the on-line computer business in the past year, but it is not yet known whether it will end up fading like the citizens-band radio craze of two decades ago. For America Online Inc, the answer could mean the difference between continued stratospheric growth or a descent to the basement alongside the CB radio and fondue pot.”

Wall Street Journal, March 29, Section B p.1, col.3 (1994)

Rise of the human machines − the ascent of the AI-driven student

At the start of the 2024/25 university year, KPMG released an article suggesting that students who use AI say they aren't learning as much. The article suggests 59% of Canadian students use AI for their schoolwork, which should have alarm bells ringing on every front. I mean, everyone was scared that Wikipedia would ruin education, but students still had to find things; now they don't even need to do that, because AI will. Two thirds admit they are not learning or retaining as much knowledge. Yeah, go figure. Is anyone really surprised?

What are they using it for? The article suggests 46% are using it to generate ideas, and 41% to do research. Wait, generate ideas – isn't that the whole point of university, further developing thinking abilities? 75% of students say it has improved the quality of their schoolwork – yeah, because they aren't doing it, AI is. How are you supposed to get better at something if you never fail? By letting AI improve your work, you aren't actually building any skills.

There are a whole lot more stats in the article, but the one that stands out is that 65% say "they feel that they are cheating when they use generative AI". Yes, because you are cheating. Look, it's one thing to have AI help with reviewing an assignment, but generating ideas? Even research is iffy – that's what university is supposed to do, teach you how to do research. What happens if AI doesn't have the complete picture to answer a question? There are a lot of historic documents that aren't digitized; how do you account for the information they contain? How about fieldwork, will AI do that for you too?

Why are these students even at university? In most jobs you get paid to think. If you're using AI, then you may as well be replaced by AI. A 2024 UK survey paints an even bleaker picture: there, 92% of students are using AI in one form or another. Why are they using it? Well, to 'save time and improve the quality of their work'. Again, duh. But it's AI doing the work, so maybe it should get the degree?

It's time to reboot higher education. Yes, there is undoubtedly a place for AI in things like research, grammar, etc. – but there is no place for it replacing the human brains that are supposed to be learning something. But then again, maybe all is already lost. AI is probably already helping people get degrees who don't even have a grasp of 'basic' English.

When leaders start treating higher education institutions like the learning establishments they should be instead of the mills they are, then perhaps change can occur. Until then, we will just relish the rise of the ‘human’ machines.

Do I miss coding?

Month six of retirement. I honestly haven't coded for something close to a year now. I don't miss it. That may seem strange after so many years of writing programs, starting with Pascal in 1987. 38-odd years. A long time to be doing something and then stop. But what is the point of programming anyways? It was interesting when I started, and probably for the next two decades or so. It was all about the problem solving for me. After about 25 years it all started to get less interesting. I can't say there was one exact thing, but it may have been when I realized that designing algorithms for image processing just wasn't interesting anymore. That's probably when I got more interested in the historical aspects of computer science.

Old algorithms are interesting because of the creative process involved in designing them. When sorting was a hot topic, a bunch of people designed sorting algorithms to deal with ordering things efficiently. Then there were decades when few or no new algorithms appeared, merely extensions of existing ones. Now new algorithms for some things are being discovered, but by AI, so not really that interesting. With how fast machines are these days, do these new sorting algorithms really make much of a difference? I doubt it. The exciting part was designing algorithms, especially when there were constraints, such as computing power, or resources. Now, it seems, all algorithms are fast.

Designing algorithms for image enhancement also got a little tiring. So many people creating these 'innovative' algorithms, but few if any made any real difference. It turns out that enhancement truly is in the eye of the beholder. Even algorithms designed to reduce artifacts like haze in travel photographs are hit-and-miss. Besides which, AI probably does a better job now, from a clinical perspective anyway − AI will never be able to look at a picture and decide it "looks good", mainly because it will never see things the way humans do (even if we perceive things in a subjective manner).

I thought I might miss programming a little bit. But no. I have no real need to write programs anymore. The one exception might be designing some program to quickly calculate something tedious, e.g. volumes of baking tins. As much as we would like to believe that kids need to learn to code, they don’t. You won’t miss anything in life by not knowing how to code (especially with AI doing the trivial coding now). Better to spend time reading books, forest bathing, and exploring the world.

Life's too short, don't you know.

Unix was a new style of computing

The UNIX system provided a new style of computing, a new way of thinking of how to attack a problem with a computer. This style was based on the use of tools: using programs separately or in combination to get a job done, rather than doing it by hand, by monolithic self-sufficient subsystems, or by special-purpose, one-time programs.
Brian Kernighan, Rob Pike, "Program Design in the UNIX Environment", in AT&T Bell Labs Technical Journal, October, p.1596 (1984)