After hardening my CentOS install, I noticed this showed up hundreds of times on the console:
ERROR: ld.so: object '/lib/libsafe.so.2' from /etc/ld.so.preload cannot be preloaded: ignored
The fix is easy enough. Use vi to edit the file /etc/ld.so.preload and comment out its only line so it reads:
#/lib/libsafe.so.2
The changes will take effect immediately. I'm not sure what exactly caused the error, but libsafe.so.2 (which is symlinked to libsafe.so.2.0.16 in my CentOS 5 install) is in its proper place, and the system can function without it.
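If you'd rather not open vi, the same one-line edit can be scripted with sed. This is a sketch against a scratch copy in /tmp; once you're comfortable with it, point sed at the real /etc/ld.so.preload (with sudo, after making a backup):

```shell
# Demo on a scratch copy; the /tmp path is made up for illustration.
echo '/lib/libsafe.so.2' > /tmp/ld.so.preload.demo
# Prefix the libsafe line with '#' (the & recalls the matched text)
sed -i 's|^/lib/libsafe.so.2|#&|' /tmp/ld.so.preload.demo
cat /tmp/ld.so.preload.demo   # now shows the commented-out line
```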
Monday, February 21, 2011
Thursday, July 1, 2010
Computers are actually more inaccurate than you think
And I'm not talking about your average programming error, either; those are problems with people, not computers. A great example of error in the core hardware of systems is how computers convert numbers, and how accurate those conversions actually are. The error is usually insignificant in simple math like addition, subtraction, multiplication, division, and modulus, and rounding normally hides it. The only place it fails is when a value is tested. For example, it would be safe to treat 99.9999999 as 100, and in fact most computers automatically round this number up when displaying it. However, the number is not rounded when it is tested or compared, so 99.9999999 == 100 evaluates to false.
A perfect example of this is how computers represent fractions and small decimals. Take a look at the following program flowchart:
start:
a = a + (1/10)
loop to [start] 10 times.
At the end of the program, a should logically equal 1. Work it out - if you add 0.1 or 1/10 ten times, the answer should be 1. Written in Perl, the above program would look like this:
#!/usr/bin/perl
for ($num = $i = 0; $i < 10; $i++) { $num += 0.1 }
if ($num != 1) {
    printf "\n$num = %.45f\n", $num;
}
Effectively, this program displays the variable $num after 0.1 has been added to it 10 times, except that the format %.45f tells the computer to display the variable as accurately as possible. The output looks a little something like this:

1 = 0.999999999999999888977697537484345957636833191

So 1, in the mind of a computer, is in fact 0.999999999999999888977697537484345957636833191. While most of us would dismiss this as too insignificant to make any sort of difference, it could mean the world to precise sciences such as chemistry, where parts per billion and parts per trillion are pretty commonplace.
The error comes from converting 1/10 into base two for use by the computer: 1/10 has no exact binary representation, so accuracy is lost in the conversion. For those of you with Macs who have no idea what I'm talking about, you can go ahead and try it yourselves - you'll produce exactly the same result. Go to your Applications and search for Terminal. Open it, type perl, paste in the program above, and press Control+D.
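The practical fix is to never test floats with == at all, but to check whether the difference falls inside a tolerance you choose. Here's a quick sketch of both comparisons using awk, which (like Perl) does its arithmetic in double-precision floats:

```shell
awk 'BEGIN {
    s = 0
    for (i = 0; i < 10; i++) s += 0.1    # same loop as the Perl program
    # Naive comparison: fails for the reason described above
    if (s == 1) print "== says: equal"; else print "== says: not equal"
    # Tolerance comparison: one part per billion is plenty here
    eps = 0.000000001
    d = s - 1; if (d < 0) d = -d
    if (d < eps) print "tolerance says: equal"; else print "tolerance says: not equal"
}'
```

The first line printed is "== says: not equal", the second "tolerance says: equal" - the accumulated error is around one part in ten quadrillion, far below any sensible tolerance.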
Thursday, May 27, 2010
Getting the Gateway CX and M series tablets working in Ubuntu 10.04 Lucid Lynx
After installing Ubuntu, you must first configure the serial interface. Open Synaptic Package Manager from System > Administration > Synaptic, mark the setserial package for installation, and press Apply to install it.
To configure the tablet, open a terminal from Applications > Accessories, and paste the following code:
sudo gedit /etc/serial.conf
A text editor will open. Paste the following line:
/dev/ttyS0 port 0x06A8 uart 16954 irq 4 baud_base 38400
Save and close the file.
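The settings in /etc/serial.conf are read at boot, but they can also be applied immediately with setserial itself, so you can test the tablet port without rebooting. This mirrors the config line above; the port, UART, and IRQ values are the ones for my hardware, so adjust them if yours differs:

```shell
# Apply the serial settings now (same values as /etc/serial.conf)
sudo setserial /dev/ttyS0 port 0x06A8 uart 16954 irq 4 baud_base 38400
# Confirm the settings took
setserial -g /dev/ttyS0
```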
Next we install the actual fpit drivers. As of 10.04, the fpit package provided in the Ubuntu repo has some dependency errors and will not install. To get around this, you'll need to set up the X Updates repo by running the following command in a terminal:
sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
Then open Synaptic and hit the Reload button to refresh the repositories. Search for the fpit package and install it.
Next, the X server configuration must be written. The following configuration should work for most CX systems; only if it does not work, follow the instructions here to generate a configuration file, then add the layout below to the generated server section. In a terminal, run:
sudo gedit /etc/X11/xorg.conf
If you do not have an Nvidia graphics card, this file should be empty. Paste in the following configuration:
# xorg.conf (X.Org X Window System server configuration file)
#
# This file was generated by dexconf, the Debian X Configuration tool, using
# values from the debconf database.
#
# Edit this file with caution, and see the xorg.conf manual page.
# (Type "man xorg.conf" at the shell prompt.)
#
# This file is automatically updated on xserver-xorg package upgrades *only*
# if it has not been modified since the last upgrade of the xserver-xorg
# package.
#
# If you have edited this file but would like it to be automatically updated
# again, run the following command:
# sudo dpkg-reconfigure -phigh xserver-xorg
Section "Monitor"
Identifier "Configured Monitor"
EndSection
Section "Screen"
Identifier "Default Screen"
Monitor "Configured Monitor"
Device "Configured Video Device"
SubSection "Display"
Virtual 1664 768
EndSubSection
EndSection
Section "InputDevice"
Identifier "Generic Keyboard"
Driver "kbd"
Option "XkbRules" "xorg"
Option "XkbModel" "pc105"
Option "XkbLayout" "us"
EndSection
Section "InputDevice"
Identifier "Configured Mouse"
Driver "mouse"
Option "CorePointer"
EndSection
Section "InputDevice"
Identifier "Synaptics Touchpad"
Driver "synaptics"
Option "SendCoreEvents" "true"
Option "Device" "/dev/psaux"
Option "Protocol" "auto-dev"
Option "HorizEdgeScroll" "0"
EndSection
Section "InputDevice"
Identifier "Tablet"
Driver "fpit"
Option "Device" "/dev/ttyS0"
Option "InvertY"
Option "MaximumXPosition" "12550"
Option "MaximumYPosition" "7650"
Option "MinimumXPosition" "400"
Option "MinimumYPosition" "400"
Option "SendCoreEvents" "true"
Option "Passive" "false"
Option "TrackRandR" "on"
EndSection
Section "ServerLayout"
Identifier "Default Layout"
Screen "Default Screen"
InputDevice "Synaptics Touchpad"
InputDevice "Tablet"
EndSection
Section "Device"
Identifier "Configured Video Device"
EndSection
Save and close the file. Finally, reboot your system and you should have a fully functioning tablet.
Sunday, May 2, 2010
Ubuntu 10.04 Lucid Lynx - Safest Ubuntu Ever
Aside from the new theme, the included video editor, and the new social media client in the latest release of Ubuntu, there is one feature that I believe can win in any argument for Linux - security.
For the longest time, Unix-based systems such as Mac OS X, BSD, and Linux have had very good security settings and procedures, and have not been targets for the common malware, viruses, and trojans that plague the Windows operating system. In many Linux-versus-Windows arguments, the claim that Linux cannot get viruses comes up, and to an extent that holds true. It's not just that attackers only write viruses targeting Windows machines: active development, the open nature of the source, and the basic design of the operating system have made *NIX platforms the safest for the home user.
First of all, because Linux systems, from the software all the way down to the kernel, are completely open source, the code is under constant surveillance by fresh new eyes watching for things that might trigger buffer overflows or exploitable memory leaks. The code is much more hardened than that of closed-source systems, which have a limited set of coders and a very broad audience, some of whom are deliberately searching for breaks in the code to take advantage of. Little quirks in powerfully dangerous things such as ActiveX scripts, and even in something as common as Adobe Reader, are quickly broken by hackers. A few weeks after an exploit is found, code is written and launched through file-sharing sites and black-box web servers to be consumed by the general public. If the exploit is executed correctly, millions of machines can be affected before the large security firms begin to catch on. Even once it is caught, it can be months before the bug behind the original exploit is reported and fixed by the company.
Next, the superuser hierarchy in *nix systems prevents malicious scripts from doing anything... well, malicious to the core system. Permissions and binaries are locked down, modifiable only by the root user, which by default is completely disabled in modern Linux desktops. Now don't get me wrong - a system can still be hacked into, given that services such as SSH, VNC, and FTP are running on it, and weak passwords account for the large majority of break-ins. A general rule of thumb, though, is that anyone with enough knowledge to launch a service like SSH will know to use complex passwords and tools like DenyHosts to secure it.
Finally, Ubuntu has become more secure because of a new feature called /usr/bin/cautious-launcher. This is now the default handler for files with any sort of executable extension, such as potentially dangerous executables that could be run by WINE. A (not-)emulator doesn't exactly create a need for antivirus: a virus run in WINE would be jailed to the home directory and would be very confused by the built-in libraries. Additionally, the virus would be dead by the next restart, as there is no mechanism in WINE that lets programs run themselves at start-up. Such actions can only be configured by the user or by natively installed software*.
So the basic security requirement in the latest version, according to https://wiki.ubuntu.com/, is the setting of an executable bit on all files with potentially executable extensions, but only if the file in question resides under /home or /tmp. cautious-launcher's job is to check a file's permissions; if the executable bit is missing, it warns the user and explains how to make the file executable. This has increased the desktop security of Ubuntu dramatically and, together with systems such as the SELinux kernel module, has made desktop Linux even more attractive to the security-minded.
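What cautious-launcher asks of the user boils down to a single chmod. A minimal sketch of that check-then-grant cycle on a scratch file (the file name is made up for the demo):

```shell
# A freshly created (or downloaded) file carries no executable bit...
rm -f /tmp/demo-download
touch /tmp/demo-download
if [ -x /tmp/demo-download ]; then echo "runs"; else echo "blocked"; fi
# ...until the user deliberately grants it, which is what the warning explains
chmod +x /tmp/demo-download
if [ -x /tmp/demo-download ]; then echo "runs"; else echo "blocked"; fi
```

The first check prints "blocked", the second "runs" - the point being that nothing becomes executable without an explicit action by the user.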
I don't know about you, but I have enjoyed not needing to buy any security software suite for the past seven years.
*Note that software can only be installed in Ubuntu through the Software Centre and PPAs, which are signed and verified with RSA public-key authentication; through .deb files, which also carry a signature; or manually by compiling from source (expert). All software install procedures require superuser privileges.
Sunday, September 27, 2009
Video editing
One of the key things people have approached me about is "Will Sony Vegas work?" Of course it always ends up being a pirated version, so the answer has always been no. (Winners don't warez.) I've recently gotten started up again in the wonderful world of video, and I can say for certain there is much being done to improve free video editing in the Linux world.

Kdenlive, even though it has not hit 1.0 yet [currently 0.7.4], is amazingly powerful. It contains all the features and effects you would expect in a commercially available piece of video software, but it isn't heavy on what I call the shininess of the GUI itself - a major reason that high-end video software completely sucks on a PC. Pinnacle Studio 10, for example: really shiny, but extremely slow and lacking in features. (Since Kdenlive was written for KDE, it's shiny already =D) Instead it brings together the best of everything having to do with the audio and video subsystems of Linux. Unlike Cinelerra, an older, less maintained video editor I use, there is one window for the entire program, and the live preview stays quick and responsive no matter how many effects are piled on. It works great with PulseAudio, so no sound problems here.

Another great feature is the way it renders the final product. Most commercial applications use their own rendering engine, so each produces only a limited number of output codecs and efficiency varies from program to program. Kdenlive uses FFmpeg, a powerful command-line rendering system that can be scripted. Since it can be scripted, it is extremely easy to render a long video across multiple machines - your very own render farm. Got two computers? Render it in half the time. On a side note, I've been looking into a way to simplify a render-farm system to be submitted to the development staff.
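To give a flavour of what "scriptable" buys you, here is a sketch of the split step for a two-machine farm. The file names and durations are made up, and nothing is rendered: the script only prints one ffmpeg command per machine (using the real -ss seek and -t duration flags), so you can eyeball the jobs before dispatching them. Stitching the finished chunks back together is a separate ffmpeg concat step.

```shell
# Hypothetical example: split a 600-second render of source.avi
# into two 300-second jobs, one line per machine.
CHUNK=300
for i in 0 1; do
    echo "ffmpeg -i source.avi -ss $((i * CHUNK)) -t $CHUNK chunk$i.mp4"
done > /tmp/render-jobs.txt
cat /tmp/render-jobs.txt
```

Each machine runs its line, then the chunks are concatenated; the seek offset advances by one chunk length per job.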