Unix / Linux / Ubuntu Forum


OS size

  Date: Jan 21    Category: Unix / Linux / Ubuntu    Views: 398
  

I thought I should make a new thread off this comment, as it sparked a very
old nagging question in my mind... Why are OSs so big? Is most of that just
drivers for every device under the sun?

An OS just talks to the devices, right, plus Java, Flash, and media players?
MS-DOS talked to almost all the same devices with whole systems that were less
than 10 MB total. Now you can't operate with less than a gigabyte of RAM and
4 GB of storage.

Why are these OSs so bloated? Old Forths ran in 16K and could still drive the
same machines, minus Java, Flash, and media players, but I bet someone who knew
what they were doing could do all that in Forth in another 16K if drivers were
standardized per category of device.


 

12 Answers Found

 
Answer #1    Answered On: Jan 21    

It's driven by consumer demand, Michael. People are now used to roaming the
world on their computers, and the young (well, and some oldies) want to
play the video content of web sites. Web builders use more and more
powerful tools, so sites use Adobe, Java, Flash and media players of all
sorts, and OSs then have to provide the platform and programs for us to run
these things.

Take the introduction of the BBC iPlayer. So many people had plagued the
BBC for past-program viewing on their web site that they first
introduced streaming clips, but people wanted whole programs and even
whole series to view, because they had missed their favorite viewing,
so the BBC decided to develop a system that would do the lot.

... the Trust noted the strong public demand for the service to be
available on a variety of operating systems. The BBC Trust made it a
condition of approval for the BBC's on-demand services that the iPlayer
is available to users of a range of operating systems, and has given a
commitment that it will ensure that the BBC meets this demand as soon as
possible. They will measure the BBC's progress on this every six months
and publish the findings.

Even the browser builders had to try to keep up so the BBC promised to
help them.

 
Answer #2    Answered On: Jan 21    

The same thing applies to computing as to anything else. Nobody wants to
drive an old car or watch a small B & W television. People want something
that is current.

Everything on your computer is bigger than it once was. My first computer
had 64K of RAM, a monochrome monitor and a 5¼-inch floppy drive that stored
128K. Compare that with today's cell phones. When you ask something to do
more, it requires more.

You assume big means bloated and it can, but it does not necessarily follow
that this has to be the case. You can still write tight code and Linux has a
history of doing this. The kernel has remained much the same in recent
years. You can still run Linux from floppy disks. What has changed is the
support modules. There are more modules because more is happening. People
have more drives, more devices, more of just about everything.

If small is your thing then you can still do it by compiling just what you
need and running a stripped down OS. You should look at Arch or LFS. Don't
assume that Ubuntu is all that Linux can do. It can be small, but most
people choose not to run a bare bones system.

 
Answer #3    Answered On: Jan 21    

There's also, to my mind at least, a small confusion here. Linux itself
is tiny - there's a (fairly old, but still viable) build-it-yourself
distro out there that compiles a bootable Linux image onto a pair of
720K floppies - what causes a lot of the apparent bloat is the fact
that, with a modern Linux install, you not only get the OS itself but
also all the applications software that allows you to actually *do*
something with the computer.

Strip out OpenOffice, F-Spot and all the rest of it and what is left is
surprisingly little.
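On a Debian/Ubuntu system you can actually see which packages account for the bulk of an install. This is only a sketch: `top_sizes` is a name invented here, the `dpkg-query` line is the usual Debian-family way to list installed sizes (in KB), and the demo feeds synthetic data so it runs anywhere:

```shell
# top_sizes [N]: sort a "size<TAB>name" listing on stdin, largest first.
top_sizes() {
    sort -rn | head -n "${1:-5}"
}

# On a real Debian/Ubuntu system you would feed it like this:
#   dpkg-query -W -f='${Installed-Size}\t${Package}\n' | top_sizes 10

# Synthetic demo data so the sketch runs without dpkg:
printf '102400\topenoffice.org-core\n20480\tlinux-image-generic\n512\tnano\n' |
    top_sizes 2
```

Remove the big application suites from the top of that list and, as the answer says, what remains is surprisingly little.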

 
Answer #4    Answered On: Jan 21    

Then things are as they should be on this score. I
doubt I'll be compiling as that just looks like a crap shoot to me. I love the
idea that you can run Linux from a floppy, that is nice to know.

It sure would be nice to have a Kdenlive CD with just the right stuff added to
it, and be able to install it COMPLETELY from just a CD or USB. But at this
point I just don't have the time to learn how that could be done.

 
Answer #5    Answered On: Jan 21    

You can play around with Arch, LFS or Gentoo in a VM, or you can install to
USB. That way you can take as long as you want to learn. The problem you are
going to have is that you need an internet connection for most of these
distributions because of how they are distributed.

There are alternatives to Kdenlive: PiTiVi, Open Movie Editor, and
Cinelerra (in order from easiest to hardest). You could install a basic
lightweight distro such as Lubuntu and add your movie editor. Remove what
you don't want and use Reconstructor to build a live CD or ISO which you
could install on a USB stick. Make sure you have a fast one, and one large
enough to fit your temp files. It really is not hard to do.

The nice thing about Linux is you can do pretty much what you want and most
people do manage to use it in diverse and interesting ways.
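As a rough sketch of the final step, here is one way to write a finished ISO to a USB stick with `dd` and verify the copy. `write_image`, `demo.iso` and the target name are invented for this demo; on a real stick the target would be something like `/dev/sdX`, which you should confirm with `lsblk` first, because `dd` will happily overwrite the wrong disk:

```shell
# write_image ISO TARGET: copy, flush to the device, then compare
# the target against the image byte-for-byte.
write_image() {
    dd if="$1" of="$2" bs=4M conv=fsync status=none &&
    cmp -n "$(wc -c < "$1")" "$1" "$2"
}

# Demo against ordinary files so nothing real gets overwritten:
dd if=/dev/zero of=demo.iso bs=1024 count=64 status=none
write_image demo.iso demo-stick.img && echo "copy verified"
```

The `cmp -n` check reads back only as many bytes as the image is long, which is what you want when the stick is bigger than the ISO.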

 
Answer #6    Answered On: Jan 21    

I appreciate the brief customizing-Linux tutorial, but I'll be lucky to get
enough time on a connected system even to download a new Kdenlive to a bigger
USB so I can burn a CD or DVD from it.

I still don't know how big to make the root partition on, say, a 16 GB USB
device, or where to change that parameter.

Also, I've ordered a USB case for my hard drives so maybe someday I can just
install to the USB HD from a borrowed Internet connection that will allow it.

 
Answer #7    Answered On: Jan 21    

Yes, the HD case sounds like a good idea. USB keys are useful but have
limitations when it comes to large files. You can resize your partitions
from GParted or Partition Editor, as long as the partition is not mounted.
For that you need to be working from another file system or the live CD.
Windows will even work, provided you have the right software. I like the
Parted Magic live CD and always keep a copy around; you never know when you
will need it. I use rewritables so that I can upgrade the version as needed.

It is probably easier just to delete the existing partitions and start over
with a clean layout. That makes the best use of limited space.
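One way to double-check that a partition really is unmounted before touching it is to look it up in the kernel's mounts table. This is only a sketch; `is_mounted` and the demo files are invented names, and by default the helper reads the live table in `/proc/mounts`:

```shell
# is_mounted DEVICE [MOUNTS_FILE]: true if the device appears in the
# mounts table (defaults to the live one in /proc/mounts).
is_mounted() {
    grep -q "^$1 " "${2:-/proc/mounts}"
}

# Demo with a fake mounts table so the sketch runs anywhere:
printf '/dev/sda1 / ext4 rw 0 0\n' > demo-mounts
if is_mounted /dev/sda1 demo-mounts; then echo "mounted - unmount first"; fi
if is_mounted /dev/sdb1 demo-mounts; then :; else echo "safe to resize"; fi
```

GParted refuses to resize mounted partitions anyway, but a quick check like this is handy in scripts.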

 
Answer #8    Answered On: Jan 21    

Got the USB HD case and a new USB Wifi Modem with antenna.

I was able to repartition a small IDE hard drive using Gparted from the USB but
it still won't let me write a file on the drive even from the Kdenlive USB that
I used to repartition. It says I don't have permission to write the file. Any
ideas?

I was able to boot Linux from a previously setup Ubuntu HD over the USB case
line. That was cool. Now with the Kdenlive USB, the IDE HD and the USB HD I can
boot three different systems. Had to chuck the aluminum case though because the
drive got very hot.

The USB HD is erratic though. The last few times it wouldn't boot.

Also, my wifi reception was definitely boosted by the new modem and antenna,
but I keep getting knocked offline immediately. So I'm pretty sure now it is
either an authorization issue or a denial of service attack. NSA maybe?

I don't know why I would be able to access google and then get bumped if it was
just an authorization issue. NYCWireless is a public access system so there is
no authorization needed normally, unless they have changed that since I was last
on a year or so ago.

 
Answer #9    Answered On: Jan 21    

What started OSs expanding was the graphical user interface, and ever
since then, the more graphical abilities were added, the more the
overall size of the OS increased. Sure, we could still be using DOS /
Forth etc. and the command line to do everything, but that kept the use
of computers in the hands of those that could understand how to use them. The GUI
released the world of computing to the great unwashed and so heralded
the world of computers we see now. Without that, computers would still
be largely an elitist occupation with prices of systems still making
them out of reach for most folks. It's also debatable whether 'the
internet' as we have now would ever have evolved.

I agree that sometimes the pursuit of graphical nirvana has led to
code bloat on a massive scale and a slackness in coding has been
allowed to develop as restrictions of RAM / CPU horsepower were
lifted, leading to code that is larger than it could optimally be
written. There's a saying that any fool can make things complicated
and it takes a wise man to make them simple - a quick look at remote
controls for TV / Audio etc will bear that one out !!

Personally, I give thanks for the GUI / multi-tasking / drag-and-drop
world we live in.

 
Answer #10    Answered On: Jan 21    

Agreed. Linux is one of the few OSes that allow you to work at different
levels. You can work in a terminal and surf raw and use EMACS if that is
your thing or you can use a fully graphical desktop with all of the bells
and whistles. Different strokes for different folks.

Anybody who wants to run lean should consider using something like Arch or
Gentoo. Ubuntu is great, but its approach is to hit all of the right notes
with the most people. That means that some people have more than they need
and others less. Arch gives you a bare bones setup and you build it as you
want. You could also look at Debian and its net install if you still want to
go graphical much of the way.

 
Answer #11    Answered On: Jan 21    

Minix and other small OSs don't seem to need so much to do a GUI. Granted,
they are crude and buggy, but still... I hear what you are saying, though.

It would be fun to see a breakdown of what is where; a pie chart of code
sizes, i.e. x bytes for the screen interface, y bytes for the USB driver, etc.
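Short of a real pie chart, you can get a rough breakdown with `du`. Pointed at `/lib/modules/$(uname -r)/kernel`, the helper below shows how the driver tree splits by category (net, sound, usb, ...); `size_breakdown` and the demo tree are invented for this sketch so it runs anywhere:

```shell
# size_breakdown DIR: per-subdirectory totals in KB, biggest first.
size_breakdown() {
    du -sk "$1"/* 2>/dev/null | sort -rn
}

# Real use:  size_breakdown /lib/modules/$(uname -r)/kernel
# Demo on a throwaway tree:
mkdir -p demo-tree/net demo-tree/sound
dd if=/dev/zero of=demo-tree/net/driver.ko bs=1024 count=64 status=none
dd if=/dev/zero of=demo-tree/sound/codec.ko bs=1024 count=8 status=none
size_breakdown demo-tree
```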

 
Answer #12    Answered On: Jan 21    

In the days of DOS, user interfaces were supplied by the running
program. That job is now part of the OS.
Much of a GUI is images stored on your hard drive, and images take up
a lot of space. Many routines are now built into OS APIs that, in the
DOS days, had to be built into every program; just the books and
references describing those APIs and how to hook into them are huge.
Mouse-handling routines are bigger than you would expect. Remember that
no mouse was used on the old DOS machines, and resolution stunk. Where
mouse support existed, it was implemented program by program and did not
work in all programs, because the handling was done in each program's own
routines, not in the OS as it is today. DOS was single-tasking: one
program would run at a time and never more than one. A multi-tasking OS
has scheduling routines built in to allot time to everything running,
and, analogous to interrupts, flags set by programs to demand processor
time. None of that was an issue with DOS, a single-tasking OS.

Do you remember how many years machines were sold as "OS/2 ready" before
OS/2 was released? Multi-user, multi-processing OSs are a huge
accomplishment. That said, they also run slower, because processor time
is sliced between running processes, and they require more resources:
disk, memory, and processor. Making the system look like it is doing
more than one thing at a time is the major reason for the extra power and size.

In-line code makes a process run faster, but unlike the system calls
used for modular code, in-line code is repeated many times. That makes
the faster code take more space in memory as well as on disk. (Don't
forget that a GUI is both a memory hog and a processor hog, neither of
which was a DOS issue.)

Summary:

Optimized (in-line) code: fast running, with a larger demand on disk
space and memory.
Modular code: slower running, with less need for large hard drives and
lower memory requirements, but requiring faster processors.

It is all compromise, and in this day of low-priced hardware it is easy
to understand the larger hardware requirements, compared with the
hardware optimization of the past. In dedicated controllers,
optimization for the hardware is still an issue, so we still find more
compact code there even today.

With this information we can begin to see why Linux geeks still consider
compiling an OS from source, rather than the simple install that most of us
prefer.
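The in-line versus modular trade-off above can be illustrated with a toy example: the same fifty steps written out in full ("in-lined") versus defined once and called fifty times. The file names here are invented for the demo; both generated scripts do exactly the same work, but the in-lined one is several times larger:

```shell
# One unit of work, as a line of shell:
body='echo working; sleep 0'

# "In-line": the block is written out 50 times.
for i in $(seq 50); do echo "$body"; done > inline.sh

# "Modular": the block is defined once and called 50 times.
{ echo "step() { $body; }"; for i in $(seq 50); do echo step; done; } > modular.sh

# Same behavior, very different sizes:
sh inline.sh > out-inline.txt
sh modular.sh > out-modular.txt
cmp out-inline.txt out-modular.txt && echo "identical output"
wc -c inline.sh modular.sh
```

Real compilers make the same trade when they decide whether to inline a function: fewer calls, more copies of the code.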

 