imgseek - must try this; looks a bit like iPhoto for Linux?
Friday, January 31, 2003
Friday, January 24, 2003
SCOoffice mail server - Exchange Server for Linux, basically.
Thursday, January 23, 2003
Not directly Linux, but I plan to learn some Perl at some point later this year, and will probably do so on the Linux platform. This is an incredibly useful tutorial page for programmers.
Wednesday, January 22, 2003
A while back, I said I'd do an overview of compiling apps from tar.gz files or whatever. I'd better do that before I forget, as I'm bound to want to read my own notes when I next have a need to do it in 6 months :)
It's actually not particularly important whether the install files are .tar or .tar.gz. An overview of these is:
.tar is a tar file. It's like a package of multiple files all "wrapped up". To extract all the contents of a tar file (to untar the file), use the tar command. I usually try
tar -xvf (filename)
, where (filename) is the name of the tar file. Note that -x is extract, -v is verbose and -f is file, i.e. the file that you want to untar. The man pages will give you more. Anyway, if you run this from the directory that the tar file is in, it will create a new subdirectory there containing the extracted contents of the tar file.
.gz is a gzip file. This is compression, basically. To uncompress (unzip) the file, use the gunzip command. I usually try
gunzip (filename)
, which is pretty straightforward! If you run this from the directory that the .gz file is in, it will extract the contents of the gzip file there.
What often happens is that a .tar file is compressed using gzip. The net effect is that multiple files are packaged up and compressed, just like the effect that WinZip has in Windows. This results in a .tar.gz file.
I prefer to extract the compressed files in two stages, using gunzip and then tar as described. Note that these file operations append a filename suffix as they happen, so .tar.gz has been through tar then gzip. To extract the contents you must reverse the order, so gunzip then (un)tar. It's pretty logical really :)
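Here's a self-contained sketch of that two-stage approach. All the file and directory names below are invented for the demo (substitute your own download); it builds a throwaway .tar.gz first just so the extraction can be shown end to end:

```shell
# Build a throwaway .tar.gz so the two-stage extraction can be
# demonstrated (directory and file names here are made up).
mkdir -p demo/src
echo "hello" > demo/src/readme.txt
tar -cvf demo/src.tar -C demo src   # package src/ into a tar file
gzip demo/src.tar                   # compress it: now demo/src.tar.gz

# The actual two-stage extraction: gunzip first, then untar,
# reversing the order the file was created in.
gunzip demo/src.tar.gz              # back to demo/src.tar
mkdir -p demo/out
tar -xvf demo/src.tar -C demo/out   # contents appear under demo/out/
```

(The -C option tells tar which directory to work in; without it, everything happens in the current directory as described above.)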
You might prefer to look at the -z option for the tar command, which in some circumstances allows you to unzip and untar a .tar.gz file in one command, e.g.
tar -xzvf (filename)
Trouble is, I haven't fully got to grips with that yet; the -z option needs a tar that supports it (GNU tar does, some older ones don't), so it doesn't always work for me. I'll get round to experimenting with it one day.....
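For what it's worth, with GNU tar the one-step form does work in both directions. A quick self-contained check, again with invented filenames:

```shell
# Create and unpack a .tar.gz in one step each, using GNU tar's -z
# option (-c creates, -x extracts; -z handles the gzip part in both).
mkdir -p demo2/src
echo "hi" > demo2/src/file.txt
tar -czvf demo2/src.tar.gz -C demo2 src    # tar + gzip in one go
mkdir -p demo2/out
tar -xzvf demo2/src.tar.gz -C demo2/out    # gunzip + untar in one go
```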
Once you have the extracted files in a directory, compiling the application is pretty easy. I always do all of this as root, but you should only actually need to do the final step (make install) as root.
Anyway, first, change directory to the root directory of the extracted files. This will usually contain an executable script named configure. Run configure using
./configure
Your computer will now get busy for a bit - be patient; it's testing your system to see if the installation can happen. If you are missing some essential components that the application needs, it will fail and tell you what you need. One prime example is gcc (the GNU C compiler) - you need this (or another C compiler) to compile many applications.
Assuming configure finishes OK, you next need to run make - just type make and hit return. (It was actually configure that constructed the Makefile, which records the target locations for installation components and where key components live on your particular system/distro; make now compiles the application according to it.) Again, be patient, this can take a while on a slow system.
Finally, if make ran OK (i.e. you got no errors), your application is now compiled for your system, so perform the install. To do this just type
make install
Once again, your computer will get busy, but it is now installing the application according to the system-specific makefile you made earlier. If all is well, your application will be installed and ready to use when this process finishes.
You can then also get rid of any files that you used for the install, like the contents of the tar file that you extracted.
Before you do so, it may be worth scanning through the lines of crap that have just scrolled up your screen, as many programs create an uninstall file and/or directory somewhere - you might want to make a note of this, and ensure you don't delete it!
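To see those three steps end to end without a real source tarball, here's a mock project. Everything in it is invented for the demo: the hand-written configure script just fakes the checks and writes a tiny two-target Makefile, whereas a real project ships its own, much bigger configure.

```shell
# Mock of ./configure && make && make install (all names invented;
# a real configure does far more checking than this).
mkdir -p demo-app
cat > demo-app/configure <<'EOF'
#!/bin/sh
echo "checking build environment... ok (mock)"
# Write a tiny Makefile: 'all' builds the app, 'install' copies it.
printf 'all:\n\techo built > app\n\ninstall: all\n\tcp app app-installed\n' > Makefile
EOF
chmod +x demo-app/configure

( cd demo-app && ./configure && make && make install )
```

A real make install copies files into system directories like /usr/local, which is exactly why that step (and usually only that step) needs root; this mock just copies within its own directory so it runs as anyone.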
Sunday, January 12, 2003
So I did a rebuild to try and organise disk space a bit better. First I tried Red Hat 7.2 again, and got the install down to about 750MB. I then ran up2date to get the latest versions of all packages, and after about 36 hours of downloading and installing, the install size was back to 1GB. Arse.
Next I tried Red Hat 8.0. I managed to deselect everything possible and get the install size down to about 800MB, but this is already latest versions, so that's OK. Problem is, X didn't work at all, neither did PCMCIA. I want this system running well by next weekend, so I went back to 7.2 and didn't run any up2date. I now have Linux, Gnome and Window Maker available for 750MB.
That still seems a lot to me, so I've been looking for smaller distros. The problem is, I need something for a newbie like me. I don't want to customise a Slackware install and then slowly recompile the kernel and add packages to give me tailored minimal functionality. It's not hard to understand - I just want to know why there isn't a "minimal graphical" install available for a main distro. All I need are core components, networking, PCMCIA, X and Window Maker, and I'm happy. I just don't want to use more than, say, 500MB of disk.
So I found RULE, which is a project to allow people with crappy hardware to run Up-to-Date Linux (Red Hat only at the moment, but more on the way). This may be the answer I've been looking for.....
As I said, I need the craptop to be running decently next week, as I plan to use it for a fortnight as a network security tool (nessus and nmap are essential) and also to maybe dabble with shell programming basics. After that, RULE might be given a try..... I'll post here if and when that happens :)
Wednesday, January 08, 2003
Tuesday, January 07, 2003
For the rebuild: a Red Hat 7.0 install guide by someone who knows their onions.
Thursday, January 02, 2003
The Linux Terminal Server Project - might have a play with this, as the crap laptop could just become a thin client.