Some like Microsoft (Windows). Others prefer Apple (Unix). Ever try Linux (Ubuntu, Mint, or ...)?



Ubuntu 22.10: What’s New? [Video]

By Joey Sneddon

All told, Ubuntu 22.10 is a fairly modest update to Ubuntu 22.04 LTS. The inclusion of GNOME 43 and the switch to PipeWire as the underlying audio stack are both significant changes, but there’s little else to lure long-term support users away from the luxurious bosom (oo-er) of stability.

The computer countenance, the metaphorical face of the machine that confronts computer users each day, is probably application software. BUT !!
Underlying that there's additional software called the "operating system" (OS). It's the OS that enables the hardware to run your application software.
Are you happy with your OS?

This graphics or "gaming" computer I use for the Internet is the platform for my still images and videos. A month ago my still-image software, an app within Win10, stopped working properly. I researched repairing it, but found it much easier to simply replace Microsoft's headache with a separate image-viewer software package.
That's an OS problem. Since we're now at Microsoft Windows 11, shouldn't Microsoft have worked the bugs out of displaying still images on the screen by now?

Tired of Microsoft? Ever try Unix (Apple), or Linux? Got something even better?
 
All I ask is that when I turn it on, it works. After that, everything is negotiable.

My cell phone runs on Android and I'm very happy with that also.
 
I got my first PC in 1981. It was a super-ship. The standard back then was 64K of RAM. Mine had 128K, and two floppy disk drives.
It came with a DOS manual, maybe a hundred pages? But read the manual, and you were an expert.

My telephone is the proper kind, on the wall by my desk.
I own some smartphones that run Android. I have the thousand-KB PDF manual, but I've been slacking. I wouldn't use it as a phone. But it could be a dandy pocket computer for those rare occasions I'd want one.

I've tried Ubuntu on a former Windows computer. I was fiddling with an additional monitor, and just plugged it in to see if the connection cable I had was compatible. It automatically configured the second monitor and set it to "extended" mode: two separate screens, not two identical displays of the same thing. iirc it's more complicated with Windows.
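For what it's worth, you can also drive this from the command line on most Linux desktops with xrandr. A rough sketch, assuming the outputs are named eDP-1 (laptop panel) and HDMI-1; run xrandr with no arguments to see your actual names:

# list connected outputs and their available modes
xrandr
# extend the desktop: place the HDMI monitor to the right of the laptop panel
xrandr --output HDMI-1 --auto --right-of eDP-1
# or mirror the laptop panel instead of extending
xrandr --output HDMI-1 --auto --same-as eDP-1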

My problem with Linux has been configuring peripherals such as printers. That used to be easy with Windows. But I recently had such a problem configuring my perfectly fine printer under Win10 that I gave up and bought a new printer. The new one was a major headache too. Makes me long for my trusty dot matrix.
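On the Linux side, most distros now handle printers through CUPS, which has gotten friendlier with network printers. A hedged sketch; the queue name and address below are placeholders, and this assumes the printer speaks driverless IPP:

# configure through the CUPS web interface in a browser:
#   http://localhost:631
# or add a queue from the command line:
lpadmin -p MyPrinter -E -v ipp://192.168.1.50/ipp/print -m everywhere
# print a test file to the new queue
lp -d MyPrinter /etc/hosts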
 
My first was a "Dragon 32" that ran on audio tapes, closely followed by an Acorn Electron and a Commodore 64. All were equally useless.

My first "real" computer was a home build (I didnt build it) if memory serves it had two hard drives 1x 1Gb and 1x2Gb it would do the internet on dial up but was slow and the internet was expensive (no google no youtube)
 
I consider the Commodore 64 (there's that 64K of RAM again) PC-evolution royalty. Never had one. But I remember someone showing me how their audio-cassette storage system worked. What I noticed: back then I was using audio cassettes, which had ~5-second leaders on them. The computer cassettes didn't. Magnetic tape, spool to spool.
Computer technology evolves quick. REAL quick!

Intel Core i9 13900K and Core i5 13600K review: an effective redoubt against AMD's Ryzen 7000 advances

Eurogamer.net

Intel i9-13900K and i5-13600K review: Beating AMD at its own game

Ars Technica
 
PC Mag has been a go-to resource for home computer users since before the Internet. Their comment on Windows 11:

10 Big Reasons Not to Upgrade to Windows 11

If you don't want to buy a new computer or give up an efficient interface that you love, then don't upgrade to Windows 11. Windows 10 does everything just as well, and in some cases even better.

1. You Might Have to Buy a New PC for Windows 11

2. The Windows 10 Taskbar Is Better

3. Windows 10's Start Menu Is Better

 
The older versions of Mac OS X work the best, but you can make a 'Hackintosh' on some PC models nowadays anyway, so you don't have to pay the high price for expensive Apple hardware. Plus, they offer all the old OSes for free on the Apple website, going back to 10.11. Of course, for compatibility you want Windows.
 
t #6
Way to plant the hatchet in Microsoft's forehead!
I recently "upgraded" to Win10 from Win7. There's a Win7 overlay that makes Win10 run more like a Win7 OS. And I downloaded the solitaire games, so I don't notice much difference. One of the biggest changes: Win7's "Snipping Tool" couldn't handle full UHD / 4K screen captures. Win10's snip can, and does.

BR #7
I tried Apple for a few years. But their 3rd-party software partners were crooks, and Apple was little better.
I agree that Windows seems to be the default industry-standard OS, at least for home computing. My objection is that Microsoft is sadistic about it, making gratuitous, needless changes with each new Windows version, and then pulling the rug out from under users of previous versions by withdrawing support.

I upgraded to Win10 because Win7 didn't support 64 gigs of RAM, and probably not the i9 processor either. But I despise the Microsoft cyclical torture test inflicted on its "customers", more accurately "victims". That would be my reason for switching to Linux permanently.
 
It depends on what you use, surely, but yeah, a lot of 3rd-party software makers for OSX do charge high prices and break compatibility between upgrades of OS and program, which is a nightmare and has cost people many thousands of pounds/dollars/etc. over the years.
 
One of the most fabulous American comedians alive today, Steve Martin, played the title character in the movie The Jerk. Therein, Martin is learning how to work at a carnival midway when he has a forehead-slapping realization: "It's a PROFIT DEAL !!"
Not too long ago Apple was listed as the wealthiest commercial entity on Earth. But it has passed into gratuitous overkill. The E.U. requiring Apple to use the standard USB-C port instead of a proprietary Apple charging port is but one of countless examples.

Perhaps naïvely, I think of Linux as a little more pure than Windows, or Unix.

And while I may simply be outright wrong about it, I sincerely believe, based upon such famous marketing successes as the VW Beetle, that if Microsoft gave consumers the choice:

- an up-to-date OS that works much like the Microsoft OS they're accustomed to, or

- the latest Frankenstein Redmond can dream up, as they've been doing since Windows 1.0

I sincerely believe the former would be at very least a commercial success, a $benefit to Microsoft's $bottom $line.

In ignorance, I suspect and hope that's what at least some Linux distros already do. If not, this is a gargantuan opportunity.
 
Indeed.

That's what Ubuntu was SUPPOSED to be: Linux for the masses, right? But that didn't quite materialise. It's not bad and it's a good attempt, but it's just not there, is it. OSX just nicked BSD code and made Darwin (the OSX kernel, whose XNU core is open-sourced under the Apple Public Source License) using it (mind you, so did the Windows NT kernel [as in using BSD code], still in partial use today in Win11), and slapped a non-FOSS GUI [OSX] and other elements on top of Darwin.

They should start from the bottom up, and probably should have nicked MORE BSD code given the permissive BSD licensing terms (can't remember exactly what they are, but they're a far cry from the GPL), and make Windows Great Again! (yeah, I know).

Would be nice to see a BSD-based attempt at Ubuntu or something, wouldn't it? Gosh, this struggle has been going on with consumer-usable versions of Linux since the 90s, hasn't it; nothing ever changes, does it.

In the old days you had more competition, at least with the likes of SGI's IRIX and their computers, and OS/2 and other hardware, not to mention Sun Microsystems (now bought up by Oracle? is that right?) and their SunOS and (now open-sourced) SPARC processors etc. Prior to that, you had even more competition the further you go back, into the days of 'homebrew computing' (before I was born :) ).
 
BTW, it's none of my business, but what 'still image software' were you using on Windows? And what other graphics software? Just interested.
 
Not trying to blow you off here, BR #11, but at the level of computing your #11 addresses I'm barely ~20% computer literate. I recognize the OS/2 reference. - snicker snicker -

iirc IBM faltered with it, and tried to elbow itself some status by boasting OS/2 was a better DOS than DOS, and a better Windows than Windows. I tried it at that point, and it wouldn't boot my computer at all.

My opinion: the hardware guys are ahead of the software guys so far. BUT !!

I gather the hardware guys are careening toward an immovable brick wall: the quantum limit. Right now, execute a command and a stampede of electrons gallops down the wire to make it happen. But we've been thinning the herd for generations. Eventually we'll winnow it down to one electron, a single quantum of charge. That's the limit as far as I know. Can't imagine how they'd send half an electron down the wire. BUT !!

The sky's the limit for the software guys, and they've got plenty of catching up to do.

- ps pending -
 
BTW, it's none of my business, but what 'still image software' were you using on Windows? And what other graphics software? Just interested.
I don't know the name. But as far back as I can remember (Win3.0?) there was a native OS graphics viewer. If not then, as soon as digital photography provided something to view.

BUT !!

On my Win10 machine the native MS Win10 graphics software pooped out. I Googled for a solution, but it was too complicated. So I abandoned the broken native MS image-viewer software and started using ImageGlass. Not wild about it. But at least it works.

Similar story with my Sony cam. It shoots vids at UHD / 4K resolution. In my experience (until then) all I had to do was load the vids or stills onto the computer hard drive (HDD), click, and the computer OS would do the rest. But not with the Sony. I was so perplexed I very nearly threw the camera away. Instead it collected dust for years, until I learned two things.

1) I could configure the Sony to use a file format compatible with my computer, and

2) There was a free video player named VLC that works well.

My first flat-panel display was a small 720HD. I've upgraded a few times since then, and now have two 43" UHD / 4K displays over my desk. Excellent! Seems to me it's getting close to not being able to distinguish between a computer display and looking out the window. Matter of fact, if I build a house, I may skip most windows and mount flat-panel displays instead. It'll consume electricity. BUT !! Because I'm near the Canadian border, winters get chilly here. I'm guessing I'd save more energy going semi-windowless. AND I could mount a joystick at each monitor, so I could pan, tilt, and zoom (PTZ).

In fact, when I want a closer look, I leave the magnifying glass on my desk and shoot some nice 4K images instead. Then I display them on the screen. About as much detail as I've needed, so far.

In the past week or two I learned my computer has video-capture software, so that:

If during a 1-hour recording there's a 3-minute segment I want to save, I can play that segment and record it, essentially editing out the dross. The stumbling block I've hit now: I tried to attach one of these 3-minute segments to an e-mail to send to a friend, but Gmail says it's too big. So I need a way to send such vids.
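One common workaround is to shrink the clip below Gmail's 25 MB attachment limit before sending. A rough sketch with ffmpeg, assuming it's installed; the filenames and timestamps are placeholders:

# cut the 3-minute segment straight out of the 1-hour recording, no re-encode
ffmpeg -ss 00:12:00 -t 00:03:00 -i recording.mp4 -c copy segment.mp4
# if it's still too big, re-encode it smaller (a higher -crf means a smaller file)
ffmpeg -i segment.mp4 -c:v libx264 -crf 28 -preset slow -c:a aac -b:a 96k segment_small.mp4

Failing that, the usual route is a file-sharing link (Google Drive and the like) rather than an attachment.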

I've got 3 minutes on how to put together a Spanish Windlass in a hurry; got a friend who might enjoy it.
 
I'd really love to see 4k images you've taken of the Adirondacks. :)


I've taken a few of where I live here, but you can get a good idea of what it's like round the country parks in my area by looking at Flickr (they were much better than me; my Canon Powershot 710HXS gathers dust as we speak...).

My computer is so slow and irritating that it can't even display 4K @ 60FPS stuff on YouTube, nor can it do so in VLC.
 
I've tried a few times to post a nice image, but for the first time @CV the system is preventing the images from displaying. It shows only a tiny placeholder image.


I'll have to have a chat with my webmaster. I might have maxed out the storage capacity allocation we started with.

Seems like it's time for a new computer for you. This may be a good time of year: in the U.S. there are often big sales around Christmas. Not sure how it works there.

btw
Not sure, but we might have hit a plateau at UHD / 4K.
I gather there are higher resolutions reaching the market. But to my knowledge 1080HD is about the highest resolution for over-the-air (OTA) broadcasts around here. Point being, when they're constantly upgrading, it's difficult to know where to hop on the upgrade merry-go-round. December may be the time.
 
My borrowed computer should be alright; it's an Aspire ES 15 (i5-4210U, turbo up to 2.7 GHz), so it should be OK, and it's Hackintosh-able, but I need a USB stick. It's the Intel GMA graphics which suck and make graphics suck. But then the Mac I was using before was lesser-spec'd and could run 4K videos at 60FPS, just about..? Is it just Windows being badly coded? I suspect so.
My Mac, which I was using, needs a new power adapter (one of the old pre-USB-C ones, sadly, so more expensive), but once that's working it should be OK. My 2013 Mac has worked pretty much alright for years. If I could afford an upgrade (I can't), I'd probably get a very easily Hackintosh-able PC for about £750ish, hoping for a quad-core i7 from Intel's 2017/18 range; think that's about right for the price range-ish. It should run OSX 10.11 or 10.12, which were still OK in terms of efficiency compared to their latest awful offerings.
 
I bet you could find more up-to-date hardware on a rubbish pile. The trick of course is intercepting it before it's returned to the bio-assimilation pile.

I have noticed that with equipment that's marginal for a graphics task, some graphics software is more efficient than others. So within a budget, it might make sense to try a few graphics options.

I didn't find VLC. I was at the public library one day, and a stranger caught me with a cheap notebook computer under my arm. He insisted that I download VLC, and he mentioned some of the fabulous things it could do.
It stayed on that notebook for years, until it suddenly occurred to me one day, VLC might play back the Sony handycam vids I'd shot.

Yup.

And I've been using that camera ever since. I even figured out how to get 20-megapixel, mechanically stabilized, digitally stabilized still images. Bummer I can't post any here. If you like, we might open a thread for it @The Round Table. I'll see if that'll work. ...
 