Computing Innovation Stagnation

These discussions debate whether there have been fundamentally new concepts or significant progress in computer science since the 1960s or 1970s, contrasting rapid early advances with perceived recent stagnation and frequently referencing figures like Alan Kay.

📉 Falling 0.5x · Science
2,779 comments · 20 years active · 5 top authors · Topic ID #8602

Activity Over Time

Year    Comments
2007    22
2008    23
2009    61
2010    80
2011    106
2012    96
2013    139
2014    93
2015    141
2016    158
2017    163
2018    161
2019    162
2020    181
2021    235
2022    250
2023    267
2024    190
2025    230
2026    21

Keywords

IT JS ALGOL youtube.com GDP MIT ENIAC UI JVM AI computing science computer science computer computers alan kay ideas past kay swim

Sample Comments

chimpburger Oct 31, 2019 View on HN

There haven't been any fundamentally new concepts in computing for decades.

larodi Jun 14, 2017 View on HN

This is one of those moments when you realize how a working solution can still be cool 30 years later, even improved and still working to the benefit of humanity. Do we need to rush into all these new paradigm shifts? Maybe the evolution of computing needs to take a breath and introspect on all the 20th-century achievements and improvements before reinventing the wheel.

timf Jul 6, 2009 View on HN

> Really, whatever can be done now with a computer was in principle doable in 1965

Why are people so attached to this line of reasoning... it's like saying whatever can be done now with an automobile was in principle doable in 1920 because the laws of mechanical engineering haven't changed. Cars now are "just" faster and have a few more doohickeys on them. It's sort of a tautology once you make the paradigm in question all-encompassing enough. I would love to see mind-erasing ad

iamwil Jan 24, 2018 View on HN

I hear this notion bandied about once in a while. I don't think it's strictly true. It's just something you say to express frustration about things you broadly don't like about tech, by comparing them to an idealized version of the past. There were plenty of frivolous things back in the '60s. We just don't remember them, since nothing written about them survived for us to read. As for some modern-day people doing progressive work in computing that should chang

otabdeveloper4 Nov 30, 2020 View on HN

Did you read my comment? There are no "new" ideas in computing. Everything has already been invented in the 1960's. It's the finesse of execution that matters, not ideas.

sametmax Oct 23, 2017 View on HN

Computing as we know it is not even a century old, and we have already had several radical shifts in it. Compared to house building, medicine, or transportation... we have millennia of experience with those, and yet we don't revolutionize them every decade. Things take time. Let's try to find balance in what we've got already. Once we have a hold on that, we'll have plenty of opportunity for the rest. Innovation is not an endangered concept in the human species.

iamgopal Jun 23, 2021 View on HN

Videos from the past were amazing, considering they came just a couple of decades after the invention of the transistor. (I think that is because, before the invention of the transistor, we already had the mathematical foundations of computation almost ready.) In the current era, do we have something that needs to be invented? (i.e., a different part of science can predict it, but the actual part of the science is falling behind?)

bsaul Oct 22, 2017 View on HN

I wonder what today's equivalent of these pioneering technologies is... biotech gene editing? Quantum computers? Artificial intelligence? Cars? Is there someone somewhere discovering the ubiquitous usage of the next decades? Does computer science still have the potential to bring the same kind of world-changing tools? Here's something I'd like to ask Alan Kay: at that time, you probably had the feeling that you were working on groundbreaking technologies, but what we

nickpsecurity Feb 22, 2017 View on HN

Cloud computing is the mainframe model. ALGOL was a safe, maintainable language for systems, with LISP being the ultimate for scripting or flexibility. Current languages mostly do what those could already do. High-performance, NoSQL databases were how mainframes started. Clustering & early distributed systems came via DEC's VMS & Tandem NonStop. Xerox PARC came up with desktops. Engelbart's people demonstrated all kinds of stuff in 1968: https://en.wikipedia.org/wiki

Someone Jan 28, 2021 View on HN

Title of this HN post is correct in the sense that it’s the title of the referenced article, but that title is wrong for the article’s content. The article claims computer science hasn’t moved forward (much) since 1978, claiming that the changes in the 40 years between 1940 and 1980 were much larger than those in the next 40 years (1980 to 2020). I think a simple model could explain that. Let’s say computing went from ‘1’ to ‘41’ in those first forty years, an improvement of about a fac
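The comment is cut off mid-arithmetic, but the shape of its model is clear enough to sketch. A minimal sketch in Python, assuming linear growth (the continuation past "41", another +40 to reach 81 by 2020, is an assumption, since the original text is truncated):

    # Toy linear-growth model (numbers past "1 to 41" are assumed, since the
    # comment is truncated): capability gains the same absolute amount in each
    # 40-year window, but the multiplicative factor, which is what reads as
    # "progress", keeps shrinking.

    def growth_factor(start: float, end: float) -> float:
        """Multiplicative improvement over a window."""
        return end / start

    print(growth_factor(1, 41))   # 41.0  : a 41x jump, feels revolutionary
    print(growth_factor(41, 81))  # ~1.98 : the same +40 gain, feels stagnant

Under that reading, "stagnation" is just what a constant rate of absolute progress looks like once the baseline is large.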