Thirty Days of Chaos?

Next Tuesday sets off a month (well, 39 days) of what could be the biggest change in tech since Apple kicked off the ARM transition in 2020 - first for the Mac, and now for Windows as well.

Hyperbole, yes. But think about it. From the May 7 Apple iPad event to Build to WWDC, the tech industry will begin its real march to the AI drum, backed by what looks like some impressive new hardware to support it. The real question is whether the AI hype will pan out or the bubble will burst like the 2008 mortgage crash. In fact, I fear there could be more to that comparison than meets the eye, since that crash wiped out 20-plus years of retirement savings in an instant.

Is AI for real - in other words, is it a genuinely new computing paradigm, or is it just the competent Google/Bing/Siri search we’ve wanted for years - the relational database for the general user?

It’s also going to be fascinating to see how things shake out between the soon-to-be-released iPad Pros and the Surface Pro 10 WOA. Apple has always walked the picket fence between “tablet only” and “what is a computer,” but I feel that fence is about to become a razor blade for them. They either need to go toe-to-toe with the Surface Pro or give up the ghost.

For M$, it is obviously AI über alles. Will they be able to force-feed Copilot to the market, or will it truly become just the improved Bing that they never managed to pull off? It could be a make-or-break moment for the Surface line as well.

What do you think, tribe?

4 Likes

For a long while now, I’ve argued that what is really needed is a general-purpose graphical programming environment for making graphical applications, and that one of the things holding this back is the lack of a general-purpose database built into operating systems.

The Newton came so close with its concept of “soups,” and I’d give a lot to be able to identify, tag, and persistently store information that way.
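For the curious, here’s a toy sketch of the idea - schema-less “frames” plus tags, persisted with nothing beyond Python’s standard library. Every name in it is invented for illustration, and the real Newton soups were far richer than this:

```python
# Toy sketch of a "soup"-style store: schema-less entries ("frames") plus
# tags, persisted using only the Python standard library.
import json
import sqlite3

db = sqlite3.connect("soup.db")
db.execute("CREATE TABLE IF NOT EXISTS entries (id INTEGER PRIMARY KEY, frame TEXT)")
db.execute("CREATE TABLE IF NOT EXISTS tags (entry_id INTEGER, tag TEXT)")

def add(frame: dict, *tags: str) -> int:
    """Persist an arbitrary dict (a 'frame') and tag it for later retrieval."""
    cur = db.execute("INSERT INTO entries (frame) VALUES (?)", (json.dumps(frame),))
    db.executemany("INSERT INTO tags VALUES (?, ?)",
                   [(cur.lastrowid, t) for t in tags])
    db.commit()
    return cur.lastrowid

def find(tag: str) -> list[dict]:
    """Return every frame carrying the given tag, whatever its shape."""
    rows = db.execute("SELECT frame FROM entries JOIN tags ON id = entry_id "
                      "WHERE tag = ?", (tag,))
    return [json.loads(r[0]) for r in rows]

add({"kind": "note", "text": "Call the accountant"}, "todo", "finance")
add({"kind": "contact", "name": "Jane", "phone": "555-0100"}, "finance")
print(find("finance"))  # both entries come back, despite different fields
```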

The problem is that any sort of persistent interaction involving files is a huge potential security hole.

Maybe one way to sandbox things is to allow only some graphically expressed subset of programming? An AI-enhanced Automator.app could actually get some real work done.

The Mac OS, with its robust underlying security model, is certainly far better poised for this sort of thing than Windows.

I do think a large-scale, easy-access personal database would be a welcome improvement to personal computing. Really, it makes more sense in the personal-assistant category that Alexa and Google Home inhabit. But I agree with @WillAdams on the security point: either we just give up all our information to our AI overlords, or we have to significantly restructure how data is accessed.

But in the long term, I could imagine a personal AI that could not only access all our personal data the way we want, but also build interfaces for us to interact with it however we want. It could look something like: “Hey Google, create a mock-up app for my phone that does xyz, pulling the specified data from my personal vault, and send it to my phone to review.” And off we go, creating our own personalized app to access data how we want. Conversationally, we could just go back and forth with our AI voice assistant until we got what we wanted, then say, “OK, now build it and install it on my network of devices.” I think the future very likely looks like AI voice assistants replacing 90% of the apps we use and integrating seamlessly with the other 10% that rely on shared data, like social media apps.

But in the short term, I think we’ll have a lot of bumps along the way. I don’t particularly think the next month and a half will be the most chaotic, but it will probably lay the groundwork for whatever direction we eventually end up with for local LLM AI.

What I want to see is an on-device LLM that can be restricted to the data on your device for its answers - AND IT NEVER PHONES HOME!
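This is roughly doable today with llama-cpp-python and a GGUF model file you’ve already downloaded. A sketch - the model path, notes file, and question are all stand-ins, and naive prompt-stuffing here takes the place of a proper retrieval step:

```python
# Rough sketch of a fully offline query with llama-cpp-python: the model is
# a GGUF file already on disk, and nothing below touches the network.
from llama_cpp import Llama

llm = Llama(model_path="./models/local-model.gguf", n_ctx=4096)

# Naive "focus it on my data": stuff local notes straight into the prompt.
# A real setup would use retrieval to pick only the relevant passages.
notes = open("my_notes.txt").read()[:8000]

out = llm(
    "Using only the notes below, answer the question.\n\n"
    f"Notes:\n{notes}\n\n"
    "Question: What did I decide about backups?\nAnswer:",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```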

4 Likes

I highly doubt our gracious overlords will allow all that juicy data to stay local. It’s too valuable.

1 Like

You know, maybe I’ve been thinking this bass-ackwards. Maybe my MacBook should be fully disconnected from the internet, and my iPad Pro should be the sole portal for search, email, messaging, etc. to the dirty outside world. I guess they could still steal my emails, but even that could be partially blocked in an Outlook setup. Hmmmmmm?

Would it be possible to install a private LLM on an unconnected device to work on just my data? Still, updates to the MacBook would be a problem even in that scenario…PARANOID MUCH?

1 Like

My firm’s accounting and billing software sits resident on a desktop PC running XP with no internet or local network connection. If you’re paranoid, you are not alone.

5 Likes

My Psych 101 prof always said “It’s not paranoia if they’re coming to get ya!”

3 Likes

I fall on the opposite side of that fence. Even if one time you happen to be right, it was still paranoia.

3 Likes

I’d say it’s for real - at least at my present company there are so many people (machine learning experts pivoting to LLMs) working on it that I think some people/groups will stumble upon a useful application for internal efficiency and cost savings.

Outside of corporate internal usage, I feel it will have an inevitable impact on the everyday person, but it won’t be in obvious ways (e.g., built into Google searches, finance, health…)

Yes… sorta kinda… you’d take an existing open-source LLM and “fine-tune” it to incorporate your own data.

I do this in a corporate environment already with our IP, but I don’t think it’s quite DIY because:

  1. you’d probably need a GPU cluster for the requisite horsepower and VRAM to fine-tune, depending on the volume of data you have (although if you already have a fine-tuned LLM, you could conceivably run the predictive/generative part on a gaming PC)

  2. it would be helpful to have machine learning experience, as the fine-tuning work requires a training/iteration process that I think calls for some skill in data processing, database programming, natural language processing scripting, analytics prowess, etc.

Perhaps over time, these processes can be automated as the models mature… although currently there isn’t a true understanding of how neural networks manage to do predictive language generation with such plausible accuracy, which adds to the difficulty of improving the frameworks for general public use, IMO.
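To make the “sorta kinda” concrete, here’s roughly what the fine-tuning step looks like with Hugging Face transformers plus LoRA adapters via peft. A sketch only - the base model, data file, and hyperparameters are placeholders, and it assumes the GPU horsepower from point 1:

```python
# Rough sketch of "fine-tune an open LLM on your own data" using Hugging Face
# transformers + peft (LoRA). Base model name, data file, and hyperparameters
# are placeholders; assumes a GPU with enough VRAM (see point 1 above).
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

base = "mistralai/Mistral-7B-v0.1"  # placeholder: any open-weights causal LM
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# LoRA trains small low-rank adapter matrices instead of all base weights,
# which is what makes this merely expensive rather than impossible at home.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM"))

ds = load_dataset("text", data_files={"train": "my_private_docs.txt"})["train"]
ds = ds.map(lambda x: tok(x["text"], truncation=True, max_length=512),
            remove_columns=["text"])

Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
).train()

model.save_pretrained("out/lora-adapter")  # just the adapter weights, a few MB
```

The training/iteration grind from point 2 is everything around this snippet: cleaning the data, choosing the hyperparameters, and evaluating whether the outputs are actually any good.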

3 Likes


Wow - how did I miss this nugget the first time? That alone is scary.

1 Like

Most exciting aspect of the 30 days of chaos: hifi will go from @Hifihedgehog to Hifihedgehog - Playing with Surface Pro 10.

5 Likes

OK - with only 11 days to go, where are all the juicy Surface Pro 10 rumors, featurettes, lies…

4 Likes

What lies? :crazy_face:

[Animated GIF: Gollum from “The Lord of the Rings,” captioned “Smeagol lied.”]

2 Likes