mastodontech.de is one of many independent Mastodon servers you can use to participate in the fediverse.
Open to everyone (over 16) and provided by Markus'Blog

Server statistics:

1.4K active profiles

#technological


'One of the great ironies of history is that the triumph of #Maga has led to the piecemeal destruction of everything that once made #America great, and on every level.

Its power derived from a reliable #trade network, with logistical chains that were the wonders of the world, combined with a huge alliance #network, and the greatest #scientific and #technological institutes in the world.'

theguardian.com/us-news/commen

The Guardian · This Fourth of July, the world declares its independence from America · By Stephen Marche

#geopolitics #eu #economics FYI I follow this stuff. Whilst the #uk relies on the #us, at least the #eu is able to take a wider view. However I'm not unaware of #chinese internal #politics #ethics and #standards on #human #rights. As always, "sup with the devil, best sup with a long spoon" - or just give up our addiction to being first with #technological change? A conundrum wrapped in a paradox.
theguardian.com/business/2025/

The Guardian · EU may as well be ‘province of China’ due to reliance on imports, says industrialist · By Lisa O'Carroll

People continue to think about #AI in terms of #2010s computing, which is part of the reason everyone gets it wrong whether they're #antiAI or #tech bros.

Look, we had 8GB of #ram as the standard for a decade. The standard was set in 2014, and in 2015 #AlphaGo beat a human at #Go.

Why? Because #hardware lags #software - in #economic terms: supply follows demand, but demand cannot create its own supply.

It takes 3 years for a new chip to go through the #technological readiness levels and be released.

It takes 5 years for a new #chip architecture. E.g. the #Zen architecture was conceived in 2012, and released in 2017.

It takes 10 years for a new type of technology, like a #GPU.

Now, AlphaGo needed a lot of RAM, so why did RAM stagnate for a decade after doubling every two years before that?
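A quick back-of-the-envelope sketch (in Python, using only the figures stated above: an 8GB standard set in 2014 and a two-year doubling cadence before that) shows how far the old trend and the actual standard drifted apart:

```python
# Hypothetical projection: what the "standard" amount of RAM would have been
# if the pre-2014 two-year doubling had continued, versus the ~8 GB plateau.
base_gb = 8  # the 2014 standard mentioned in the post
for year in range(2014, 2026, 2):
    projected = base_gb * 2 ** ((year - 2014) // 2)
    print(f"{year}: projected {projected} GB vs ~{base_gb} GB actual standard")
```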

In 2007 the #Iphone was released. #Computers were all becoming smaller, #energy #efficiency was becoming paramount, and everything was moving to the #cloud.

In 2017, most people used their computer for a few applications and a web browser. But also in 2017, companies were starting to build #technology for AI, as it was becoming increasingly important.

Five years after that, we're in the #pandemic lockdowns, people are buying more powerful computers, we have #LLMs, and companies are beginning to jack up the cost of cloud services.

#Apple releases chips with large amounts of unified #memory, #ChatGPT starts to break the internet, and in 2025, GPU growth continues to outpace CPU growth and you have a competitor to Apple's unified memory.

The era of cloud computing and surfing the #web is dead.

The hype of multi-trillion parameter #LLMs making #AGI is a fantasy. There isn't enough power to do that, there aren't enough chips, it's already too expensive.

What _is_ coming is AI tech performing well and running locally, without the cloud. AI tech is _not_ just chatbots and #aiart. It's going to change what you can do with your #computer.
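As a purely illustrative sketch of what "running locally without the cloud" can look like in practice, here is a minimal Python example using the Hugging Face transformers library; the tiny distilgpt2 model is just a stand-in for whatever local model you would actually choose:

```python
# Minimal local text generation: the model is downloaded once, then inference
# runs entirely on your own machine, with no cloud API involved.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator("Local AI means", max_new_tokens=30)
print(result[0]["generated_text"])
```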