Do you depend on your computer for your living? If so, I’m sure you’ve thought long and hard about which hardware and software to use. I’d like to explain why I use generic “white boxes” running open source software. These give me a platform I rely on for 100% availability. They also provide a low-cost solution with excellent security and privacy.
People’s requirements vary, so what I use may not be the best choice for you. I’m a support person for databases and operating systems, and I also do consulting that involves research, presenting, and writing. I use my own computers and work from home. This article is about desktops and laptops, not handheld devices.
Replaceable Hardware
I need 100% system availability. If I don’t have a functioning computer at all times, I can’t do my job. I’m unhappily “on vacation” whenever I’m fixing my computers. My solution is to use only hardware I can fix or replace immediately.
One could adopt other strategies to meet these stringent hardware requirements. Some pay more for higher-quality equipment, betting that this results in fewer failures. Some rely on vendors for support, selecting a responsive company with a good reputation for service. Knowledgeable help is vital, and many prefer local support staff who are easily accessible. Thom Holwerda wrote an excellent article explaining why he picks iMacs for high availability.
I take a different approach. I use generic white boxes with all stock parts. Since computers are inexpensive, I keep several on hand, along with extra parts, so it’s easy to swap parts if necessary. PCs are highly standardized — if you acquire them with an eye to non-proprietary components. I open up and inspect every machine before I use it. (Watch out with laptops: some vendors mold their DVD drives into non-standard shapes or add proprietary plastic you have to fit onto your hard disk to connect it properly.)
For my self-service approach to work, you have to know how to perform basic hardware problem identification. You don’t need to be hardware-trained; I’m not. The key is to be able to quickly identify common problems, because the hardware fixes are easy with a replacement strategy. A good problem-ID procedure and a few rules of thumb are all you need. (I’ll share mine in another article if people are interested.)
If a hardware problem requires more than a few minutes, use a backup computer. Once this was prohibitively expensive; today cheap generic boxes make it feasible. Another change from years past is that you no longer need current hardware to run current software. I run resource-heavy apps like enterprise DBMSs and website generators with a few gigabytes of memory and a low-end dual-core processor. That’s a five-year-old machine. You can get a fleet of them for the cost of one hot new gaming box.
Critical to my approach is keeping your work — your data — portable. Back it up and move it between machines with a USB memory stick. Don’t ever get into a situation where your data resides only on a single machine. The same goes for software: if you depend on certain applications for your work, ensure they’re available on more than one machine.
To do this, just copy data directories or entire partitions between computers. If you need a certain application or configuration for your work, copy it. If a USB memory stick isn’t big enough to hold your copies, use a USB hard disk, or perform network copies. I run them in the background while I do other work.
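As a minimal sketch of what such copies can look like (the hostname and paths here are hypothetical):

    # Mirror a working directory to a second machine over SSH, in the
    # background; -a preserves permissions and timestamps, -z compresses.
    rsync -az ~/work/ backupbox:/srv/work/ &

    # Or image a whole (unmounted) partition onto a USB hard disk
    # mounted at /mnt/usb (run as root).
    dd if=/dev/sda2 of=/mnt/usb/sda2.img bs=4M

A nice property of rsync is that re-running it only transfers what changed, which is what makes casual, frequent copies practical.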
Virtual machines are also useful: just move guest OS files between VM hosts. Virtualization lets you easily, safely, and securely run multiple OSes on one computer.
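Moving a guest between hosts can be as simple as copying its disk image. A sketch, assuming a qemu/KVM guest under libvirt with a hypothetical file name:

    # Copy the guest's disk image to the other VM host over SSH, then
    # attach it to a machine definition there and boot it.
    scp /var/lib/libvirt/images/guest.qcow2 otherhost:/var/lib/libvirt/images/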
Vendors are well aware that generic hardware and portable software threaten their profits. That’s why most proprietarize any way they can. Unified Extensible Firmware Interface (UEFI) is the latest of many attempts to kill competition with an artificial barrier. The rationale for UEFI lockdown you often read about — that it prevents boot viruses — is intended to mislead. The last time boot viruses were a major problem was back when people booted from floppies. It’s not boot viruses you have to worry about; it’s the malware running within Windows that causes the problems.
Applying this Philosophy to Software
To apply this philosophy to software, I use stock parts that can easily be installed, copied, or replicated across machines and backup devices. There’s a name for such software: open source. While open source software (OSS) saves you money, flexibility and licensing are the big benefits. You control it; it doesn’t control you.
Let me give you a single example: backup and recovery. In Windows World, there must be a dozen ways to recover a lost system (off-hand, I can think of the Recovery Console, System Backup and Restore, recovery partitions managed by OEM software from vendors like HP or Dell, the Last Known Good Configuration, Safe Boot mode, Registry Export/Import, and performing a Repair Install). Why so many different ways to solve a single problem?
The answer is that vendors want to control your backup and recovery. Otherwise they can’t lock you in and make you a source of continuing revenue. Vendors claim “ease of use” — but is it really easy when you face this tower of B/R babble? With OSS, I issue a single command to either back up or recover. I don’t have to navigate a half-dozen different apps designed to “help” me.
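A sketch of what that single command can be (the paths and user name are illustrative, not a prescription):

    # Back up a home directory to an external disk...
    tar czf /mnt/usb/home-$(date +%F).tar.gz -C / home/howard

    # ...and recover it later with the mirror-image command.
    tar xzf /mnt/usb/home-2012-11-24.tar.gz -C /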
Here’s a real-world example. My motherboard died last summer. I removed the boot disk from the dead system, plopped it into another, then booted that Linux instance on the target computer. Problem solved! Windows won’t let you do this. Its hardware-bound Registry, authentication procedures, and licensing all specifically prevent it. They’re designed to. Why? So you don’t steal Microsoft’s software. Microsoft places its need to protect its ownership of Windows software above your need to solve your crisis. (Remember, you do not own the copy of Windows you “bought”; Microsoft owns it. You only licensed it.)
Microsoft has every right to protect its property. But that’s not our problem. Our problem is fixing our motherboard failure. Because of its agenda, Microsoft makes our lives more difficult. Its software limits your flexibility — on purpose. Heck, you can’t even move an installed app from one disk to another without special software. The Registry — Microsoft’s control choke point — prevents it.
OSS lets you easily move software across machines, disks, or operating systems with just a command or two. I replicate operating systems, applications, and data how and when I need to. No Registry, licensing, authentication, hardware binding, or other artificial barriers make my job more difficult.
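On a Debian-family system, for instance, even the installed application set can be replicated in a command or two. A sketch, and by no means the only way:

    # On the source machine: record the list of installed packages.
    dpkg --get-selections > packages.txt

    # On the target machine: replay the list and install everything in it.
    dpkg --set-selections < packages.txt
    apt-get dselect-upgrade

Carry packages.txt on the same USB stick as your data, and any spare box can be rebuilt into a working clone in short order.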
Here’s another tip: don’t use an operating system you didn’t install yourself. There was a time when a vendor-installed OS meant peak performance and a malware-free system. Those days are gone. Major incidents have shown that preinstalled malware is now a reality, ranging from spyware to rootkits to adware to craplets. This problem will get worse before it gets better. Security and privacy require that you control your computer. If you use an OS someone else installed, you don’t control it.
Compatibility
Most of the business world uses Microsoft’s desktop software, so a big issue for those using my strategy is compatibility. How will you fit into Windows World? The answer depends on the kind of work you do.
For some IT professionals, this means running Windows and the Microsoft stack. “Use what your clients use.” I hear you and agree 100%. Do what you need to do. For most people, however, compatibility merely requires file interchange. I’m in this group. All we need for compatibility is the ability to create, update, send, and receive Microsoft Office files.
Using LibreOffice, I’ve encountered very few problems in exchanging word processing and spreadsheet files. Just stick to the features common to both LibreOffice and MS Office and avoid complex formats and layouts. The web has many articles on how to use LO and MS Office compatibly. (Ironically, LO is often more compatible with older versions of MS Office than is the current version of MS Office!)
The compatibility picture isn’t quite as rosy when it comes to presentation graphics. Move a 40-slide PowerPoint file between office suites and you’ll see many minor changes (spacing and fonts, for example). I circumvent this by presenting to clients with my LibreOffice laptop and handing out hardcopies of the foils.
Years ago, I used to double-check how my OSS-produced files looked on Windows XP. For example, I’d check that a Word document I created with OpenOffice looked the same in MS Word, or I’d verify that web pages created with Kompozer and Firefox rendered properly in Internet Explorer. I don’t know whether it’s because OSS compatibility has improved or because I’ve learned how to avoid incompatibilities, but I haven’t bothered with double-checking for a long while.
Application availability is another concern. Do all the products you need run under Linux? Everything I need runs natively. For some folks Microsoft products are an important exception, since all are Windows-only. You can usually solve this problem with Wine, a compatibility layer that runs nearly 20,000 Windows programs on Linux.
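Running a Windows program under Wine is usually a one-liner. A sketch, with a hypothetical installer name, on a Debian/Ubuntu-style system:

    # Install Wine, then run a Windows installer with it.
    sudo apt-get install wine
    wine setup.exe

    # Installed programs land under ~/.wine and launch the same way:
    wine ~/.wine/drive_c/Program\ Files/SomeApp/someapp.exe

Wine’s application database (appdb.winehq.org) rates how well specific Windows programs run, so it’s worth checking there before committing.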
Business Savings
I’m an independent consultant. What works for me may or may not work for you, or for small or large businesses. Still, when I see how some companies operate, I wonder if they’re wasting money. Many could remain on Windows while strategically replacing components to their great advantage. This avoids a disruptive platform change while capitalizing on open source tools and apps.
Office suites are the perfect example. Microsoft Office licenses are not cheap, especially for smaller companies that can’t swing the big discounts. LibreOffice and OpenOffice are functionally very competitive. You really have to wonder why more companies don’t even evaluate them.
Some would answer: support. But what kind of support do you get from a vendor that you can’t get from the Internet? I’m old enough to remember when vendors created bug fixes for customer problems. Today they just tell you to wait for the next release (which they always insist you install, whether or not it fixes your problem). Support consists merely of workarounds and how-tos. You can get that online for free.
Another possibility is to keep Windows but replace Microsoft’s proprietary development environment. Leave the ever-shifting sands of Microsoft’s frameworks in favor of open source IDEs, programming languages, tools, and databases. Some companies score good savings while producing excellent apps with WAMP (Windows + Apache + MySQL + PHP/Perl/Python).
These ideas aren’t for everyone, but it always amazes me that some IT pros are so tightly wrapped in the vendor security blanket that they don’t even evaluate alternatives. Some security blankets are well worth the money. Others only represent inexperience or inertia. Only you know which statement applies to your organization.
The Bottom Line
Inexpensive stock parts work well for my hardware and software needs. They’re easily replaceable, so I enjoy 100% availability. Low cost, high security, and good privacy are extra benefits. What are your requirements, and what desktop strategy do you use?
– – – – – – – – – – – – – – – – – – – – – –
Howard Fosdick (President, FCI) supports databases and operating
systems and consults as an industry analyst. Read his other articles here.
Switched from Windows XP to Linux over 10 years ago. I’ve never looked back. And you’re right about the hard drive swap from one motherboard to another. I’ve done the same several times now. Even if the video driver at bootup is for a different card than the original, the OS will switch automatically to a generic VESA or other generic driver.
What happens if the hard drive goes?
You can be back up and running with Windows or any other operating system if you plan for it.
Windows Vista/7 and 8 can also deal with that. I have had the same Windows installation cross motherboard hardware … the only common denominator was that the processor was Intel.
I think you’re grasping at straws here. I wasn’t referring to a dead hard drive. I was referring to moving a hard drive from one motherboard to another and proceeding without interruption, as indicated by this portion of Howard’s article:
You actually *can* do this with Windows; I have done it numerous times.
Guess what, I did this with BeOS even longer ago.
No one cares. This shouldn’t be a feature point on a chart. It should be assumed that a system is designed well enough to cope with such a change.
I just recently did the swapping to get both CrunchBang and Windows 7 installed on an old laptop of mine that doesn’t support USB booting (“hurray”).
The problem is that it is presented as though it is a Linux/*nix-only trait, when in fact any OS can be set up to do it. My criticism is not with the OS; it is with the article, which pretends that this is a major selling point of an alternative OS when, tbh, it isn’t.
Exactly my point
But I got down-voted even though what I said was 100% valid.
Yes, that is unfortunate. I only down-vote if it’s actually warranted. User voting always ends up with little factions. Really annoying.
And your post is 100% valid and accurate. Windows has booted into generic VGA if no drivers are available since its inception and, starting with XP, into a VERY fast VESA. Windows VESA seriously gives BeOS/Haiku a run for its money.
I remember reading about Windows 95/98 doing pretty darn well with the bundled drivers (near-eternal “Building a Driver Information Database” dialog aside) too.
Giving him the benefit of the doubt, I can only assume he’s praising either:
1. Linux bundling 99% of its drivers into a default install.
2. Linux’s lack of a hardware-tied activation system under any distro. (We don’t know he’s talking about a version using a volume license key)
Still can be done on modern Windows without many problems, other than that you run with no hardware acceleration for a while.
Shame Office 2010 wasn’t as smart as Win7 with regard to moving. I can kind of accept that I can’t install a Pro license on the new machine without unregistering the license from the old machine. One license, one install; fair enough.
What I can’t understand is why the uninstall program does not unregister the license during the process, or even provide the option. And if you do uninstall, you can’t re-install back on that same machine again because your license is flagged as already registered. WTF… it’s the machine it was registered to run on in the first place. So, I can’t re-install or move the license to a new machine without calling Microsoft on the phone and asking permission? This is a Pro license, for F sakes.
Now, I have to go make a phone call and try not to throw up in my mouth during it.
I’ve done this often.
I’ve moved one Windows 2K3 array across hardware three times without a hitch. My current desktop switched pretty much everything, including processor manufacturer, when I cloned it over to the new one. It worked fine, from big stuff all the way down to the color calibration of my monitors.
I think it is a bit of a myth that Windows can’t deal with this. From Windows 2000 on, it seems pretty solid. I never tried it on desktops back in the 3.x – 9x days, so I can’t comment. Maybe it didn’t work back then. I did move NT4 around a few times, and it worked OK, but I did not do that often.
It actually works with 9x as well. Windows *can* be REALLY finicky, but, in general, it works fine if you do it correctly.
They even had “hardware profiles” in previous versions of Windows.
Good point! I’d forgotten about those. I think they were mostly used for docking stations, but I bet not exclusively.
They can deal fine as long as you stay in the same architecture: for example, upgrading from an old Intel Core Duo to an i7, or from an AMD Athlon II/Phenom to an FX.
But if you do a cross-architecture upgrade, then good luck with that.
Linux not only survives that, it will boot without problems, due to the kernel having most drivers built in. The only one that will probably need reinstallation is the proprietary video driver, and only if the card’s chip family changes (i.e., AMD → NVIDIA).
Not a problem since Vista/2008. I do plenty of swaps between AMD and Intel architectures. The only trouble I’ve had was storage drivers. (You need to make sure drivers for the boot device/controller are installed before swapping — Linux would have the same problem).
But since most storage drivers are in the Linux kernel anyway, and most distributions create a generic boot image (a fat one, btw), it is very, very uncommon to have any booting issues switching from one piece of hardware to another.
Like I said before, the only real issue with Linux will be the X display, and that depends on the gfx adapter. But the system will very likely boot fine.
I went from a Core 2 to an FX* without any issue. Stunningly, all my paused Windows (and Linux) VMs resumed without error, even though the processor type changed underneath them “hot” (CPU type is one of the things passed through to the guest).
Storage usually is the most delicate, at least if anything goes wrong, but I’ve moved whole drive arrays and Windows coped. Linux usually works too.
They both seem to do a real good job of handling this.
For me, the fly in the ointment in a move is almost always the NIC. On Linux it tends to break all the network config, and on Windows most of the time it doesn’t have the driver. Fortunately neither one is that hard to correct.
*Technically those are the same architecture though different types.
If you do just that, the only “problem” is that in Windows the old NIC and IP configuration remain in the registry. Occasionally I’ve seen that cause problems.
In Linux it’s just udev that needs to be told that the new NIC should be used in place of the old one.
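On distributions of that era, that was a one-line edit to udev’s persistent-net rules. A sketch (the MAC address is made up):

    # /etc/udev/rules.d/70-persistent-net.rules
    # Point the eth0 name at the new card's MAC address; delete the old
    # card's line, then reboot or reload udev.
    SUBSYSTEM=="net", ACTION=="add", ATTR{address}=="00:1a:2b:3c:4d:5e", NAME="eth0"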
I once swapped memory sticks and changed the video card on my desktop, and Windows XP started asking me to get a new license. The original point made by the author is very valid: the Windows OS license is tied to the hardware.
Yeah, but depending on what license you have for Windows, transferring the same OS installation among different machines may or may not be legal.
Something which is a non-issue with Linux or BSDs, for example.
Technically, however, both Linux and Windows are a mixed bag when it comes to having an installation work across different machines/configurations. At least from personal experience.
Still, when I see how some companies operate, I wonder if they’re wasting money.
Absolutely. I once contracted at a [large company] and saw first hand how the IT department was screwing the rest of the [company] out of money. An exclusive contract with one of the name brand PC manufacturers required upgrading every single device every three years. There were 30,000+ devices in this company. Each device was sold with a Windows 7 Professional COA that went unused because the company had also purchased an Enterprise license from Microsoft for Windows XP+Office 2k7. As of last year, we were still rolling out a standard 32-bit XP image, on quad-core machines with 4GB+ of memory, because legacy software required XP to run.
You might imagine that a place that invests so much $$$ in its hardware and software stack would be using a lot of specialized software, right? Well, kind of. Most users used either a Java application that ran on a remote server, or another application that was a Telnet session to an IBM zServer (the telnet client was proprietary and required a separate paid license).
How did I fit in? The 30k users had all sorts of daily issues with their machines, not due to hardware issues (rare), but due to viruses and malware. Remember, these are full Windows XP systems being used to launch a Java Application and Telnet session. In the background, they still have IE7 (IE8 was still “unproven” as of 2011) which allows them to download whatever the fuck they want onto their system. Thus, end users would render their machines completely unusable due to the shit they were downloading. Standard practice was, if after two hours, you couldn’t remove the malware (which you couldn’t because you were required to use the same antivirus that let the fucking machine get infected in the first place), the solution was to re-image the machine. Five hours of work total, and yeah, we billed by the hour.
I realized early on that a lot of the environment could be replicated easily and cheaply using FOSS. Any five-year-old Linux system could run the Java app and Telnet session (I tried this as a proof of concept) and would be much cheaper to support. In fact, most of the software devs were running RHEL in a virtual machine on their devices. At the very least, the company didn’t need to purchase two versions of Windows for each machine. (I later found out the decision-maker behind this policy was given her choice of free take-home machines by the hardware manufacturer, and also given an MSDN subscription so that she could brag about how she had the awesome new Windows 8 before everyone else.)
I have since moved on to a less demeaning job at a place with a much more sensible IT policy, but I completely understand how the IT department can be a Ponzi scheme that sucks the lifeblood out of the company. The place I just described was a hospital, one of the top-rated hospitals in the country at that, and the cost of all that IT bloat was being rolled into the bill for the end customer, which is really a sick child, a cancer patient, or a dying grandparent. Wasting money? Not for them. Somewhere in that $5k a night you are paying for this kind of nonsense.
I have seen that scenario so many times. I will add that many will also let the SA expire on a Microsoft volume contract, thus costing the company more.
I’ve been in the same position, although my approach was different: I replaced the antivirus with a better solution. It found all of the viruses and killed 99%; the remaining 1% we reimaged. I then upgraded the machines’ hardware and OS to Windows 7, and so far no viruses or malware, partly due to the antivirus but also due to the better administrator privilege mechanism in Windows 7, which allowed us to make users standard users without messing up their apps, with easy escalation to administrator that only the IT administrators could do.
Both, I think, are good solutions. I did look at FOSS; however, it couldn’t meet the requirements of the organisation.
The only other thing I would add to the article is to make images or backups of the software you’re using, in particular the OS. In many cases, as illustrated in the article, you can fix Linux/*nix easily; it’s a powerful feature. However, when you do have to reinstall, it’s best to reinstall the same OS you were using. What I mean is that there are new releases all the time, and some of the newer releases break things (wifi, etc.), so keeping the ISO of the release you installed and had working right is a must, as relying on a distribution to keep a copy of the same OS may leave you out of luck.
I would go further and say that if your job depends on your computer, stick to LTS or other long-supported releases.
(Also, I appreciate that a lot of users will mirror their setups onto DVD/backup disks.)
The article was a good read, though, and I agree on all the points. I would like to have a job where I wasn’t so reliant on Windows, as I like to have a good mix of OSes and would like to work somewhere Linux was in more use, just to spice things up. However, I’m not going to whine about it, as Windows 7 is rock solid and does what I need it to do!
I just wish Office 2003 wasn’t completely broken on Win7 64-bit. (Why an MS product won’t work on an MS OS… I’ll never understand.) Can’t buy WinXP licenses, can’t afford to migrate everyone to Office 2010 at the same time… weeee… fun… every week starts with another “XYZ froze on me and I lost work” complaint.
Bit of a hack, but in Win7 can’t you install it in XP Mode, and then use “unity” (VMware term, I forgot the MS one) to integrate it into the start menu like normal?
No XP Mode in Win8.
I am running Office 2003 on Win 7 64 bit. I have zero issues. Honestly your post saying that Office is broken surprised me. What exactly is broken?
If Win7 is so nice, maybe it’s time to change your avatar? ;p
There’s a lot about this article that I disagree with, but this one in particular is a doozy:
Some would answer: support. But what kind of support do you get from a vendor that you can’t get from the Internet?
Well, how about (a) access to engineers who know what they’re talking about, and (b) a guarantee that if a problem can’t be solved over the phone, we’ll have engineers on-site within an hour for as long as it takes until the problem is solved?
Some companies need that level of support because they’ve got billions of dollars at stake.
BeamishBoy,
“Well, how about (a) access to engineers who know what they’re talking about, and (b) a guarantee that if a problem can’t be solved over the phone, we’ll have engineers on-site within an hour for as long as it takes until the problem is solved?”
Although implied by the article, I don’t know if it’s really fair to compare free OSS support versus paid commercial support. If you need paid support, then pay for it; if free support is good enough, then it’s good enough. It probably depends a lot on how a particular company feels about outsourcing rather than having staff capable of managing all critical IT operations. Either way, you can often get unpaid support on proprietary products and paid support on open source ones.
I wouldn’t disagree with you. I do, however, disagree with the author of the article since this is precisely what he implied.
What a load of rubbish (highlighted).
Sorry, most Microsoft-produced languages and APIs work in most cases from Windows 2000 to 8 with very few problems … I still have VB6 apps that work perfectly well.
.NET has been solid since version 2 (.NET 1.1 still worked fine with Windows 7/2008 R2). SQL Server is backwards compatible to SQL 2000.
While there is nothing wrong with Open source stuff, why move your existing and working code for the sake of it?
lucas_maximus,
“I still have VB6 apps that work perfectly well.”
Funny, I still have nightmares from VB6 ActiveX hell.
For the most part I think backwards compatibility for *userspace* code in windows is remarkably stable, especially given the 64bit transition. It’s not perfect though, and old things are breaking. I was at a shop that was heavily invested in vb6 & activex for numerous products. One of my roles involved deploying these on new platforms, and they definitely encountered numerous incompatibilities, incurring high maintenance costs.
As a side note: Citrix works around many of the compatibility problems we encountered, albeit at some expense.
“.NET has been solid since version 2”
.NET apps are not without occasional problems either; sometimes they install on one system but not another (i.e., when both are WinXP SP3). I even gave up installing one .NET app on a new Win7 Home computer, and I never solved the problem (probably a localised issue).
“While there is nothing wrong with Open source stuff, why move your existing and working code for the sake of it?”
That’s just it, though: I wouldn’t ask people to convert for no reason. If something works, then great (no sarcasm here). But when old things start breaking, regardless of who’s at fault, it may be worth looking around at what else exists and pre-emptively avoiding another proprietary locked-in solution.
I could do with a lot more backward compatibility from Win7 to Office 2003 (compatibility mode? … rubbish). It is the bane of my existence supporting users right now.
To avoid lock-in by M$, that’s why.
.NET kind of sucks for desktop applications; Lazarus and Free Pascal are much better in my book.
Even for web-based apps you can do all the same stuff with Python and CherryPy, for example, or Flask or whatever.
The corp IT departments are full of morons… They make decisions based simply on something they read in a magazine or online, with no clue about how things really work. They love to give massive amounts of cash to M$, and they love M$ SQL Server when PostgreSQL is clearly better for just about everything. Same thing with web servers: they drool over IIS, and then wonder why it was hacked into in a matter of hours…
Really? .net apps sure beats Java apps at least. Then again, gouging your eyes out with a spork beats using Java apps on the desktop.
Sure, and you could do the same in asm if you wanted but that’s not the point.
Sure, you could do it with Perl CGI’s too.
True that
Better is a relative term. It’s not necessarily “better” if I have to spend a lot of money and time porting existing apps to it.
It’s not 2001 anymore, IIS isn’t horribly vulnerable these days.
You start to lose credibility when you say MS …
IIS has had fewer vulnerabilities reported recently than Apache.
Microsoft SQL Server is pretty good. The IDE for it is one of the better SQL editors that I have used. Last time I checked, PostgreSQL only has compatibility with Microsoft SQL 7; not sure about Oracle with PL/SQL.
That is an arbitrary (and silly) veiled ad hominem, which also makes you lose credibility. So it’s a draw in the credibility (or lack thereof) department.
“PostgreSQL only has compatibility last time I checked with Microsoft SQL 7”
Would be nice to know what you mean.
PostgreSQL is one of the most, if not the most, standards-compliant databases out there.
Obviously a lot of databases don’t stick to the standards, or they create their own (extensions) first before it is proposed as a standard.
OMG… try selling Lazarus and Free Pascal to any large company… you’ll be laughed out of the boardroom. You might just get away with Delphi by mentioning Borland, but even Delphi has dropped like a stone in popularity.
Disclaimer: I did just over 10 years’ worth of professional commercial Delphi development between 1997 and 2008, for 4 different employers (banking sector 3 years, consultant/trainer 2 years, logistics back office 2 years, patient transport/security patrolling/auditing 3.5 years). Delphi jobs dried up and there is no real call for it any more. We need to be realistic here.
So what remains in the “enterprise” is Java and .Net?
Java is what they teach in school, so that is what they’ll use. And .Net was made to look like Java.
I just make everything web-based and compliant with web standards (a lot of it lives on the server at the company and is only accessible from there).
That is how I make things platform independent. It even works on your mobile.
As the creator of JavaScript said: always bet on the web.
I don’t know that it’s the entire IT department. Granted, the tech staff can only do what management allows… and there’s your problem, usually.
The places I’ve worked have lots of difficulty keeping up with all the changes in Microsoft’s software. Say you left your employer for five years and came back… how much of your code would still be in place? How much would have had to be rewritten? Most of it, I bet, whether it was Microsoft or open source tools you used.
When it comes to new machines, I buy from these guys:
http://www.pugetsystems.com
They’re a “boutique” shop, and their prices reflect this fact, but when you buy a machine from them, you can give them specific instructions, such as telling them exactly the way you want the hard drive partitioned. Then they put the machine through a variety of stress tests, send you photos of the machine as they’re building it, update the BIOS and all the drivers, etc. Basically, when you get the PC, it has zero crapware and is ready to use out of the box. The build quality is top-notch, and the PCs are whisper quiet.
If I ever have a problem with any of my machines, there’s a local guy in town who will come and get it, take it to his shop, fix it, and then return it a day or two later.
Of course, I could always build/maintain them myself and save quite a bit of $$, but for me, it’s worth paying somebody else so that I don’t have to deal with hardware bullshit. (I rarely, if ever, have any major software issues.)
As for the OSS side, you highlighted a lot of the problems yourself. When software you need is only available on Windows (or Mac), well… what choice do you really have? And while you state that it is easier to move a setup from one PC to another, you fail to mention that Linux is more of a pain in the ass in about three dozen other different ways. And what benefit would I have for switching, besides a bunch of stated problems that I’ve never had?
As you guys get older, you will come to understand that time is the most valuable commodity you have, and that spending money to save time is often worth it. For example, if I’ve got two pieces of software that accomplish the same task, one costing $400 and the other free, and the free solution takes 3x longer to get the same task accomplished, and it is something I have to do often, then I will take the $400 solution every time, all other things being equal, of course.
I guess the takeaway here is that the solutions are not automatically better just because they’re cheaper. Some of them are, of course, but you get what I’m saying.
WorknMan,
Depends on your own income and efficiency, does it not?
I can see how someone who’s well off would just prefer to pay others. But if it took you a week or two to earn $400 of disposable income, then in theory you might be better off spending a day to do it yourself.
Maybe you’d still prefer to spend more time at work than less time on tasks you don’t like. But if your goal was to maximise family time, then you really ought to be factoring in how much time you’ll need to use just to earn the money that will pay someone else to do it.
Yeah, it certainly does depend. For example, I’m not going to pay $400 to save two hours of time. On the other hand, I would probably do it if it saves me two hours every week.
Obviously, one has to consider the cost vs. efficiency ratio. And yes, sometimes I do pay somebody else to do a task that costs more than I would make in the same amount of time, mainly when it’s something I REALLY don’t want to deal with. For example, I paid 2 guys $100 to set up a power rack that I bought, and it only took them an hour, since they put these things together for a living. I do not make $100 an hour. But it would’ve taken me at least an entire day to do the same thing, and it would’ve been a complete pain in the ass. I pay somebody else to change the oil in my car for the same reason. It all just depends on the situation.
My point is that too many people are of the opinion that saving money on software is always a good thing, no matter how shitty or dysfunctional said software is. Granted, sometimes the free or cheap option is better (or at least good enough that more expensive options don’t provide you with any real benefits), and that’s great. But when it isn’t, you should really stop and think about how much your time is worth. We can always get more money, but we don’t have the option of getting more time.
WorknMan,
I hear you.
Additionally, my luck with cheap devices is getting worse than it used to be. I’ve needed to RMA monitors, power supplies, SATA adapters, RAM, hard drives, etc. My experience doesn’t prove a trend, but I’m inclined to believe that manufacturers are racing to the bottom, cutting costs by sourcing the cheapest-quality components and workmanship they can find.
It’s impossible to tell which products are solid based on price. Sometimes good brands fail as well, and unfortunately consumers don’t have access to failure rates. Expensive devices will sometimes use the exact same boards under the hood as unknown brands. Nevertheless, I’ve decided to pay higher prices to try to increase the odds that I won’t have to waste my time dealing with a lemon.
Hm, looking at the past through somewhat rose-tinted glasses again? ;p (IMHO; and that is about what the http://www.osnews.com/thread?543085 article mostly is to me, going beyond economic stats, mostly talking about how badly societies remember their past conditions; in a way, it was also “right” in pointing out the general economic craziness WRT goals and such)
There were also tons of lemons in the past (8bit micros had meagre reliability), but we remember those less than the examples which survived longer and/or are still working.
I have been using open source since the late ’90s, but I still buy software when I see the need for it. And I do it because, as a software developer, I also need to buy stuff, and not everyone will pay me to work on open source.
Now if people are not willing to pay for software, why do they care to pay for hardware? That is the question that keeps popping up in my mind. Ideally you don’t want to depend on anyone for the full stack, while getting everything for free (gratis).
Because unlike hardware, software doesn’t cost anything to make copies of. I’m not saying that is a legitimate excuse, but that seems to be the rationale.
I think you should re-read the article; I think they point it out: price is a bonus. If Linux were only available from Red Hat at $250, but you could install it on as many machines as you want, then I think people would still feel the same.
(Obviously, Linux being vendor-neutral and free to try and experiment with helps it spread. I’m sure that is why newer versions of Windows have a “grace period.”)
I guess it depends where you will use the $400 product.
Most of the time, when I need some piece of hardware, software, or service that costs more than €100, it’s for work. I consider that employers should be paying for work tools, and generally I have no issue convincing mine that, considering how much I cost him per month, if something worth a fraction of that cost can truly make me more productive (which I have to demonstrate), he can pay for it.
Even if you’re self-employed, the reasoning still holds: if fixing your stuff costs you more money, in the form of work time, than having someone else fix it for you, then you should probably choose the latter option.
(As an aside, this is also a reason why I am strongly anti-BYOD. In my view, this is just a way for your employer or IT department to have you pay for computer maintenance costs that they benefit from.)
Neolander,
“I consider that employers should be paying for work tools, and generally I have no issue convincing mine that considering how much I cost him per month, if something that is worth a fraction of that cost can truly make me more productive (which I have to demonstrate), he can pay for that.”
Never pay for yourself what you can have someone else pay for instead!
IMO, I’d rather do the BYOD thing, if it means I can run whatever software I want, as opposed to having that dictated to me. For example, I hate working in a locked-down corporate environment, where I have to use apps like Windows Explorer, and can’t install any 3rd party tools. If it means I have to pay for those tools, I’m good with that, since I’m probably already using them anyway at home.
Yup, annoying sysadmins suck. So far, those I have met at work have been quite reasonable (either giving us an admin account on our machines, or agreeing to install well-known software like Gimp or Firefox on demand), but I have heard horror stories about those who aren’t.
I’d argue that even having your own laptop around won’t save you from them, though, because they have control over much more than just individual machines. Ever worked in one of those places where they block the IMAP and SMTP ports (and their SSL versions) for “security reasons” that they won’t explain?
Yup, I work for such a company. They also, for some inexplicable reason, decided to block all the balloon/popup notifications in the taskbar. I had a ton of scripts that were using this feature, so had to write a custom routine for all of them to get around the block.
If you bother to question any of it, ‘corporate policy’ is the standard reply. So apparently, some asshole way up stream is making all of these decisions, whom we can never speak to, and who never makes exceptions under any circumstances.
Those guys should try working for a videogames company (like I do).
Anything we do that blocks the productivity of the ‘workforce’ is frowned upon, and as such we don’t have any draconian policies or revocation of rights at a PC level.
This place runs smooth as silk without them.
You likely have a higher average user skill level than most non-IT industry places.
Windows also has problems in a myriad of different ways; it’s just that you’re used to dealing with them.
It’s interesting how many people, especially in the corporate space, think the exact opposite: if it’s expensive, it must be good.
Very true. I remember one of my managers requiring us to use Oracle DBMS just because the client company had heard of “how reliable it is.” And we only had like 7 tables…
Coming from an open source advocate, I missed the bashing against Apple, Cisco, Oracle, Sybase, Intel, SAP, and the many others I won’t bother to list.
-troll mode on-
Commercial software is evil, run away! Everything is going to be taken care of by university students in their free time.
-troll mode off-
-troll mode on-
Open Source is evil and/because there’s no money in it. Everything will be taken care of by commercial interests that really, we promise, care about what’s best for YOU.
-troll mode off-
-troll mode on-
What is this “Open Source” you speak of? Sounds mighty close to communism.
-troll mode left on-
This is a very good, high-quality, well-written article. Thanks for giving me the opportunity to check on another pro’s policies.
I’m using FLOSS exclusively myself: Linux, *BSDs, and other FLOSS operating systems.
However, the most important thing for me is data independence. I like to be able to receive, read, write, and manipulate my data no matter what platform I’m on. That’s why I always go for simple solutions, which are more likely to be portable [like dd+gzip for backups, plain text files for information storage, etc.]. It also makes the whole thing smaller, more compact, and easy to move around.
I don’t need to mention the multiple backups I make.
dd+gzip is your solution? You copy the block device?
I just use an rsync-based solution; that makes it easier to restore just one or a few files.
rsync all the way over dd…
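For comparison, a sketch of the two approaches (device names and paths are illustrative):

    # dd+gzip: image the whole block device; restore is all-or-nothing.
    dd if=/dev/sda bs=4M | gzip -c > /mnt/backup/sda.img.gz
    gunzip -c /mnt/backup/sda.img.gz | dd of=/dev/sda bs=4M

    # rsync: a file-level mirror, so restoring a single file is trivial.
    rsync -aHx --delete /home/ /mnt/backup/home/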
I will say that Windows really isn’t bad at this, though it’s certainly less generic. I had a file server fail recently, and I simply mounted the latest backup VHD on another server and tweaked the CNAME to point to the new place. It only took a couple of minutes, and everything kept working without interruption as far as the client desktops were concerned until I could bring up a proper replacement.
Was kind of cool really.
DFS replication could have made it all seamless, but that takes a lot of overhead.
This is a little off-topic, but GNOME has decided to bring back the traditional desktop elements due to the overwhelming rejection of their modern UI.
Microsoft should take note of that, because they are going to suffer the same backlash, and people will look for alternatives.
Where did you hear that?
They’re offering it as an option for GNOME 3.8 — they’re dropping fallback mode, which is kinda sad (because it tends to work better inside a virtual machine or on older hardware) but they’re bolting a traditionally-shaped desktop into the composited environment for those who want it:
http://www.webupd8.org/2012/11/gnome-shell-38-to-get-classic-mode.h…
The VM story was already worked on for longer than that.
Here is another solution which is coming:
http://www.phoronix.com/scan.php?page=news_item&px=OTM0Nw
There is more in the story, and there are others.
But it isn’t heaven yet:
http://www.phoronix.com/scan.php?page=news_item&px=MTIxMTg
LLVMPipe is kinda funny anyway:
http://www.phoronix.com/scan.php?page=news_item&px=MTA2NjY
Developers do know and work on it:
“Developers hope Unity 6.8 will improve things a bit by taking care of some visual and performance problems. One of the main performance fixes is trying to improve the performance of the Unity desktop when using the Gallium3D LLVMpipe driver as the fallback method for software accelerated when no supported GPU/driver is detected. However, Unity on LLVMpipe may still be too weak for the ARM desktop.”
http://www.phoronix.com/scan.php?page=news_item&px=MTE5OTk
I prefer to use FLOSS whenever possible, but I’m afraid I have to disagree with your position on OpenOffice.org / LibreOffice. We ran about a six-year experiment of using OOo and later Libre. I was arguably the biggest proponent, although it was our CEO’s desire to avoid the Microsoft Tax that made it actually happen. (He cares nothing about FLOSS philosophy; he just wanted to avoid the ~$300 per seat for MS Office.) We managed to limp along. Most files were viewable, if occasionally horrifically rendered, and we had the free Office viewers to fall back on. But anything complicated required finding one of the three people who were still allowed a copy of Office, and we sometimes pissed off customers when we failed to realize something important was missed due to formatting or macro errors. About 3 weeks ago our CEO finally gave in. Even I have thrown in the towel. OOo / LO has worked well for me when I didn’t have to exchange files. MS Office has no equivalent to Draw, and I vastly prefer OOo / LO’s equation editor. But making large, auto-indexed Writer documents was essentially impossible, and our sales department has a very complicated Excel spreadsheet with about 20,000 lines of macros that we never managed to convert over anyway. (Those were the guys still allowed to have Office.)
As good as LibreOffice is (I haven’t tried OOo since the Oracle / Document Foundation split and subsequent Apache takeover) it has some failings. Ultimately, interoperability is the biggest obstacle, but not the ONLY problem.
So, yeah, I prefer to use FLOSS. But I’m not going full Stallman.
That mirrors my experience. Open/Libre Office would be fine in a vacuum, but Microsoft Office’s stranglehold on the market has little to do with the programs themselves, but rather the file format.
When inter-operating between MS Office and O/L Office, many little formatting issues would crop up here and there: a paragraph that should be single-spaced but shows up double, a PowerPoint with an off-center picture, a spreadsheet with unreadable formatting.
If you put any value on your time above, say, minimum wage, it’s cheaper to pay the Microsoft tax than to spend the countless hours fixing those maddening little problems in document after document after document.
>When inter-operating between MS Office and O/L Office, many little formatting issues would crop up here and there
They also crop up here and there even when inter-operating between MS Office and MS Office. I remember one day when a colleague brought me an MSO document typed in the same version of MS Word as I had, and there were formatting issues here and there.
The main problem is how the documents are really formatted. If the user uses several pre-defined style templates, the document will be rendered and printed correctly even in OOo. Or it will be very easy to fix it just by changing style a bit.
But the much more frequent case is: text and pictures are formatted using a tabs+spaces combination until they are properly aligned in this particular instance of the text editor. Some pictures are inserted in the presentation using absolute/relative values inappropriately.
And if you have ever presented something in PowerPoint format, you may remember that you should triple-check your presentation on the same computer where you will present it; otherwise, some magic can happen.
After all, at our faculty (where Windows prevails a bit) there is even a recommendation to publish presentations as PDFs for dissertation defences instead of PowerPoint, because otherwise some formulas are unreadable or improperly displayed, or there could be encoding problems, or someone will bring a PowerPoint 2003 presentation to display in PowerPoint 2007 and have a lot of fun, etc.
It is the same for all WYSIWYG software: if it is improperly used, what you get on another machine is not the same as what you see on the current one.
I’ve seen an issue here and there between MS Office versions, but they’re incredibly rare. Issues between L/Oo and MS, however, are the norm. Usually just little issues, but they’re enough to drive one mad.
Usually when sending documents to other parties, they’re sent to be reviewed and edited, comments added, etc. PDFs would work in maybe 15% of the situations, tops. And even then, it’s extra work, and it’s just plain a hassle because it’s not a working format, just an output.
Like I said, if you value your time even a little bit, it quickly makes sense to pay the MS tax, as loathsome as I find it.
>Like I said, if you value your time even a little bit, it quickly makes sense to pay the MS tax, as loathsome as I find it.
And the Munich case is a great example of this, whoa:
http://www.h-online.com/open/news/item/Linux-brings-over-EUR10-mill…
Your post proves you didn’t read the paper. Just a few things:
They state that Munich is upgrading its hardware on a 5-year cycle, so they exclude hardware costs from the Linux side (but happily include them for Windows).
They include €4.2 million for 30k Office licenses (upgrade and full) and €2.65 million for Windows (again, upgrade and full).
If you break that down per license (€140 for Office and €143 for Windows) you will notice that they used full retail prices, not the cost of a volume license, for this calculation.
And they pay more for Windows than for Office.
My post proves that sometimes there is a real need to add an explicit “irony” tag. What this article states is that choosing free software and re-training personnel is still cheaper than paying the MS tax.
On the other hand:
“The German city of Freiburg is preparing to dump its long-running use of the OpenOffice suite in favor of a return to Microsoft’s Office after struggling with a range of document compatibility problems.”
http://www.infoworld.com/d/applications/openoffice-dumped-german-ci…
Wouldn’t a more sane strategy be to use LO/OOo internally and not export them outside of the company? Any document needing to be exported could be a PDF. LO and MSO (with a free plugin from MS) both can export as a PDF.
If you wanted to take it further, you could put a filter on the mail server that disallows doc/xls/ppt files and gives the sender an error saying such files are not allowed due to viruses, asking them to try re-sending as PDFs. This should fix the problem of opening stuff in LO/OOo and not seeing all of it. If people *really* need to send an MSO file, they can zip it.
If an office file needs to be edited both internally and externally, shouldn’t it be in Google Docs or a web app?
That is precisely what we did. However, PDF was not always a workable solution. Customers often wanted editable files, especially with spreadsheets. At any rate, most of the issues were from docs sent to us, not what we sent out.
As for Google Docs, I’d be perfectly happy with it. I don’t get to pick. I have to work with my customers, not force them to work with me.
Again, my customers would not accept this. I would literally go out of business inside of a year if I refused documents from my customers like that. We have 50 employees, and many of our customers are Fortune 500 level. Seriously, if you sent an RFQ out to a vendor 1/100th your size and they complained they don’t want your Microsoft Word file, would you bother with them ever again?
Blocking clients essentially like spam? And some people wonder why OSS advocates are sometimes not taken seriously…
(if anything, a filter of LO/OOo files in the other direction would be a more plausible solution – work internally in LO/OOo if you want to, but make sure to communicate with the outside world in MSO files, from the few computers which have MSO)
It also fixes the problem of having to deal with customers, since you won’t have any.
Hi Howard, great article. I recently installed Debian on my laptop to dual-boot, and I might even put it on my desktop now that I am becoming more familiar with it. Which distro do you like to use for home/work?
I am very interested in seeing another article like this covering your troubleshooting procedures. I imagine there are one or two that are particular to Linux/BSD. Can’t wait to learn something new.
Oh, the other thing on my mind is finding a robust backup strategy that can be applied to Linux. You obviously have it sorted; I would love to read more.
I can appreciate the time & thought put into some of these types of articles, even if they don’t offer anything new. For every pro-whatever piece written by person A, person B always feels the exact opposite and writes about pro-whatever_else.
After all the complaining about Windows from linux users, and all the complaining about linux from Windows users, and all the complaining about both from people who use both (like myself), it can always be summed up with one single sentence that has been written a million times… Use what best suits your needs. Also, make sure you understand that most hardware & software is YMMV — even the stuff with a stellar track record.
Another possibility is to keep Windows but replace Microsoft’s proprietary development environment. Leave the ever-shifting sands of Microsoft’s frameworks in favor of open source IDEs, programming languages, tools, and databases. Some companies score good savings while producing excellent apps with WAMP (Windows + Apache + MySQL + PHP/Perl/Python )
WTF? Switch to PHP/Perl/Python to get away from shifting sands? What a joke; anyone who works on LAMP servers knows damn well how much open source software expects the latest PHP, MySQL, etc., or it will puke up some random error that you have to track down. The open source world has nothing like .NET when it comes to backwards compatibility, so Fosdick might want to re-think his advocacy pieces.
Oh, and no one runs WAMP stacks anymore now that we have VMs. It’s just too easy to hang yourself thanks to the aforementioned dependencies. Too many open source web developers are hard-linking not just to LAMP dependencies but also to specific versions of CENT/RHEL. Before you get your panties in a bundle, that is actually good news for Linux: all these dependencies create inertia for LAMP stacks. It’s the shifting sands that lead to RHEL in Hyper-V instead of a WAMP or WIMP stack.
Maybe with PHP but for sure not with any of the other mentioned alternatives. At least not any more than their Windows counterparts.
Not saying you should throw out your IIS and .Net apps and switch to Linux and what not but your comment is just as clueless as the one you’re complaining about.
Well, that’s good news. WAMP was always pretty horrible but that comes with the territory, so to speak.
That’s not my experience. It would seem developers have moved on from the CentOS/RHEL stone age to distros that aren’t stuck 5+ years ago.
LAMP is PHP land and there is no equivalent to .NET on the server.
I work on both LAMP and .NET professionally and it’s not my decision to “throw out” anything. I also know what the hell I am talking about since I deal with this problem throughout the year.
Here is an example:
Zencart (one of the top shopping carts)
Zen Cart v1.5.0
Minimum server requirements:
PHP 5.2.14 or higher, or PHP 5.3.5 or higher.
Apache 2.x or newer (Specifically the latest PCI Compliant version)
PHP 5.2.14 came out in 2010. Why should a shopping cart be dependent on nearly the latest version of PHP and a specific series of a web server? Why is there an Apache dependency at all? What if I don’t want to use Apache?
This is the norm in Linuxland. Everyone builds against the latest, since there isn’t anything like .NET to maintain backwards compatibility. The standard strategy is to target the latest and tell anyone who has software dependent on PHP or MySQL N-1 to f themselves. I know this first-hand, since I’ve had to fix a lot of PHP code that had dependency breaks or was version-abandoned by the developer.
Well you don’t know much about LAMP development then. What developers would like to use and what they build against for business reasons are two entirely different things. CENT/RHEL is the standard for web servers and going outside it increases the conflict risk. That means higher support costs.
Again, don’t get defensive, since all these annoying dependencies benefit Linux when it comes to web servers. They create inertia and discourage stepping outside the norm.
ze_jerkface,
“Well you don’t know much about LAMP development then. What developers would like to use and what they build against for business reasons are two entirely different things. CENT/RHEL is the standard for web servers and going outside it increases the conflict risk. That means higher support costs.”
I think Soulbender would already agree with your complaints about PHP, as do I. But you are exaggerating the difficulty of using alternate Linux distros for the server. It’s practically plug and play no matter which distro you use. Also, I haven’t had much trouble replacing Apache with alternatives like lighttpd either; just because it’s not officially supported doesn’t mean it doesn’t work.
I’m not saying we should step outside of “supported” installations willy nilly, but if there is a good reason to then it’s certainly feasible.
Yes, it is plug and play for a basic setup, and the same is true for Windows Server and FreeBSD. Want cPanel? It’s only supported on CENT/RHEL. Sure, you can probably get it working on Debian, but then all it takes is a single module break down the line to make you wish you had stayed within the norm. LAMP software is more dependent on CENT/RHEL than it was 5 years ago. The LAMP world is not some hippie software exchange; it’s filled with commercial companies and developers that have limited resources and can’t afford to test in every distro.
That doesn’t fly in the business world. You don’t stake your reputation on unsupported software.
That’s because cPanel is a horribly written piece of software and a disgrace to system admins everywhere.
No, it’s a cost-saving measure. They don’t want to support every distro, and they don’t have to, given the popularity of CENT.
Saying “cPanel sux” won’t make it go away any more than saying “Office sux” when someone points out it doesn’t run on Linux. Inertia will continue to build around CENT/RHEL/Oracle because of software like cPanel. Sorry, but that is reality.
That may be true, but cPanel is still junk. Heck, even Webmin is better, although admittedly cPanel is not as horrible as Plesk.
I’m sure it will continue to exist and be popular in certain circles though.
Also, even though RH/CENT is officially supported, it’s still incredibly fragile and easy to break, and if you’re doing *any* kind of configuration management (Puppet, Cfengine, Chef, etc.) you can forget about it. cPanel is really only useful for people who want to re-sell hosting to mom’n’pop companies. Nothing wrong with that, but cPanel has no place in any other setting.
Except Office (I presume you mean Microsoft Office) is an actually useful and reasonably good product. cPanel isn’t.
Oracle? As in Oracle Linux? Hah. Hahaha. Sure, and the Pope will convert to Islam.
Perhaps in the “enterprise” space RH/CENT will remain popular, but “enterprise” apps and software are generally quite awful and out-of-date, so it’s a good match.
For everyone else who’s even remotely agile RHEL/CENT (especially the 5.x series) is a dead chapter.
Well, not quite. Amazon Linux is quite good and up-to-date for a CentOS-based distro and as long as AWS is popular I guess that alone will keep CentOS alive.
Now we are getting to the crux of the issue. cPanel is used heavily for shared hosting. Those hosts set up RHEL/CENT LAMP environments to use cPanel, which creates inertia. It’s not about whether you like it or not. It’s just one of many factors that have made RHEL/CENT the de facto distro for LAMP stacks. As inertia builds, more software is only supported/tested in this environment, and stepping outside of it increases the conflict risk. From a business perspective you also have to consider long-term risk. In 5 years any new LAMP software will have to run on RHEL/CENT for it to be profitable. It’s extremely low risk when it comes to compatibility and support.
Perhaps? There is only one enterprise distro that the Fortune 50 will consider. The remaining competition is between support providers, with Oracle being #2.
You have a point, and this is why we don’t use this kind of hosting. We used to, and boy was it ever a bitch to deal with cPanel, even on the supposedly supported platform.
My point is that nothing lasts forever. I see developers increasingly frustrated with having to cater to RH stone age versions of everything and pushing for a move to greener pastures. Especially in smaller, more flexible companies.
The amount of software that ONLY supports RHEL/CENT is very, very small. Most supports the major distros like RHEL, SuSE and Ubuntu.
I’m glad I don’t have to care what Fortune 50 considers good because most of the time, well, it isn’t.
ze_jerkface,
“LAMP software is more dependent on CENT/RHEL than it was 5 years ago.”
I have to disagree; I personally haven’t had a single problem working on LAMP projects across multiple distros (PHP’s own version-specific breakages aside). Is there actually something specific that’s been giving you trouble? If so, maybe we can help? If not, then what exactly is your evidence that LAMP is dependent upon RH/CENT?
“The LAMP world is not some hippie software exchange;”
If we take “P” to stand for PHP, it might be a bit hippie actually
“it’s filled with commercial companies and developers that have limited resources and can’t afford to test in every distro.”
So what? It doesn’t contradict what’s been said. Just because a LAMP package isn’t supported on X doesn’t mean it can’t run on X. I’ve been routinely developing LAMP software on my Ubuntu desktop and deploying it to various servers, including CENT/Debian, for years, and not once has that caused a problem. There was once even a Mac server.
I swear that I did not know this before today, but my own shared web hosting provider is running Debian. It’s never even mattered enough for me to check before. It has little impact on what I do.
Of course, these distros have different approaches to administration and installing packages. But I’d hope that anyone who chooses to use distro X is familiar with how to manage X packages. If that’s too much to ask, then you are really setting the bar low.
Edit:
“That doesn’t fly in the business world. You don’t stake your reputation on unsupported software.”
Well, a lot of independent developers (particularly web developers) have a job precisely because the open source code clients want to use is *unsupported* and they want *us* to support it. Not to overstate it, but I hope we can agree there’s a bit of truth to this?
LOL just ignoring PHP breaks when PHP constitutes the vast majority of plug-ins, modules and third party LAMP software like webmail and DALs. So when working with LAMP projects one only needs to avoid anything written in PHP to avoid breaks? Wow, thanks Captain. Great, everyone go write your own frameworks and plugins from scratch. Try not to build major PHP and MySQL dependencies this time.
Going to go with the patronizing attitude, eh?
Linux fans really have a hard time with criticism and that is a major understatement.
I never said LAMP is dependent on CENT/Red Hat. I said that using anything but CENT/RHEL for LAMP increases your risk of conflict. Running software on an unsupported system increases your risk of conflict. CENT/RHEL is more likely to be supported than any other distro when it comes to third party LAMP software. Ergo….
How many times do I have to repeat myself? I have never denied that but this is what can happen:
You: Hi I think I found a bug in your software that is causing me problems.
Them: Which OS are you using?
You: Not one that is supported.
Them: Goodbye.
It’s more complicated than a package management problem. Most websites are on shared/managed systems and this is why there is so much third party LAMP software that is delivered directly from developers as plug-ins or modules and built against bleeding edge / common environments. Everyone is basically on a mindless treadmill.
I agree that there are economic opportunities for unsupported custom solutions. But for LAMP those custom solutions are built on a standard base.
ze_jerkface,
“LOL just ignoring PHP breaks when PHP constitutes the vast majority of plug-ins, modules and third party LAMP software like webmail and DALs. So when working with LAMP projects one only needs to avoid anything written in PHP to avoid breaks? Wow, thanks Captain. Great, everyone go write your own frameworks and plugins from scratch. Try not to build major PHP and MySQL dependencies this time.”
You made a claim that LAMP is heavily dependent upon RH/Cent, that’s all I’m refuting.
“Going to go with the patronizing attitude, eh?”
Not really, I was playing your bluff. If you haven’t actually had any bad experiences, then the next statement becomes mostly hypothetical.
“I never said LAMP is dependent on CENT/Red Hat. I said that using anything but CENT/RHEL for LAMP increases your risk of conflict.”
I just haven’t seen it, and you probably have not either. As a professional web developer I would be able to provide support for this conflict on behalf of my clients should the need arise. If your policy is to insist on RH/Cent, that’s your prerogative. I would too *if* supporting alternatives became difficult, but it simply hasn’t.
“How many times do I have to repeat myself? I have never denied that but this is what can happen: You: Hi I think I found a bug in your software that is causing me problems. Them: Which OS are you using?”
Even CentOS itself is an *unsupported* clone of RH, yet it still gets plenty of use because it works. Same goes for debian or any other distro.
I was not able to find any OSS LAMP packages whose official support was limited to Oracle/RH/Cent, can you list a few? If not, will you admit that your argument is more hypothetical than actual?
“I agree that there are economic opportunities for unsupported custom solutions. But for LAMP those custom solutions are built on a standard base.”
Where have you seen LAMP defining RH/Cent as a standard base?
Look, I don’t really want to haggle over this any more. Can we agree that it’s ok to use whatever distro is desired as long as the web developers are ok with said configuration?
You answered your own question in your example: PCI Compliance. I’m dealing with this exact issue with a client right now. I’ve almost convinced her to leave Zen Cart behind for a sane solution, but she has a love/hate relationship with it; she loves the power and flexibility, but loathes the compliance issues. If it weren’t for the PCI Compliance nightmare, I’d be content to support her Zen Cart instance, but it’s making us both pull our hair out.
No that doesn’t answer my question because you could have PCI compliance without dependence on a specific web server version. It also doesn’t explain why a shopping cart needs the latest PHP.
Most LAMP software lacks data abstraction layers and backwards compatibility. Why is it so hard to admit this? It’s not like Linux is threatened on web servers.
Sorry, I thought it would be obvious. In software like the Apache web server and PHP, vulnerabilities are found nearly every day, necessitating patches and new versions from the developers. I for one would certainly not want to be running a web app like Zen Cart, which is responsible for handling customers’ private data and in some cases financial records, on older, unpatched software.
Perhaps you are working in the wrong field?
Quite amusing, it’s all just for pci compliance, huh? So I should be able to swap in any pci compliant web server without any problems? Are you really going to tell me that with a straight face?
A cute defense, and I see by your points that many OSNews readers bought into it, but it’s just another flailing attempt at downplaying the longstanding problem of dependency hell in Linuxland.
I can see you are too sensitive to be able to discuss this issue with intellectual honesty. I’m used to that with Linux fans, no big deal.
When did I ever say that? Making up stuff to argue your point is, well, pointless.
No argument there, it’s something I deal with no matter which distro I’m using at the time.
What the hell are you going on about? Look, I know you think no one knows you’re just nt_jerkface with a new account; after all that old account stopped posting around the time you started with this one, and the name is a clear giveaway. So from our past discussions under your old handle, you should remember that I’m fairly platform agnostic, unlike you. In fact, my favorite OS (as you well know, old friend) is BeOS and by extension Haiku, followed by OS X.
But anyway, I’m just telling you what you already know. Shame on you for trying to troll under a new moniker though; it’s very bad form my friend.
I asked why it should be dependent on a specific series of a web server. You said PCI compliance but that doesn’t answer the question. Why does it have to be Apache? You make it sound like it is purely a certification issue when it has Apache dependencies. Here I dug one up for you:
http://forum.nginx.org/read.php?2,230616
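To spell out what that thread is about: apps like this assume Apache’s .htaccess mechanism, which nginx never reads, so every rule has to be hand-translated into the server config. A rough sketch (the directives are standard Apache/nginx syntax; the rewrite rule itself is a hypothetical example, not Zen Cart’s):

    # .htaccess -- Apache only; nginx ignores these files entirely
    RewriteEngine On
    RewriteRule ^product/([0-9]+)$ index.php?product_id=$1 [L,QSA]

    # rough nginx equivalent, hand-translated into the site config
    location / {
        rewrite ^/product/([0-9]+)$ /index.php?product_id=$1 last;
    }

That’s how “Apache 2.x” quietly becomes a hard dependency rather than a certification detail.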
Sorry Sherlock but there is no mystery. I actually went over this already. I’m not trying to hide anything, why do you think I only changed two letters? The “NT” was giving ambiguous implications which is why I changed it. Go back and read through my posts with this account and you’ll see that I just wanted a slight name change. Or just ask Neolander, he welcomed me back and I said thanks.
Which platform? Windows? That must be why I have slung more hatred at Windows 8 than any other blogger.
I believe in the platform of progress. Windows 8’s Metro and Linux’s dependencies both deserve a place in hell.
.htaccess files? Wow, seriously? That’s some seriously badly engineered software you got there.
This just reinforces what I already said though; it’s not a PHP/MySQL/LAMP problem. It’s a developer problem, and I will readily admit that most PHP apps are utter crap.
To be honest, your problem seems to be mostly with PHP and not with any other part of that stack. I can understand that, and that’s why I don’t use it. Unless I really have to, and boy do those times make me want to rip my hair out.
The OP talked about PHP/Perl/Python.
OH MY GOD! You found an application with some stringent requirements. Wow, good thing there are no .NET apps out there that require a specific .Net or IIS version….
…wait. Does it depend on the latest version or 5.2 and 5.3?
Have you actually tried using it with something else? We have FPM these days, and I have yet to find a PHP app that runs under mod_php but not FPM. Maybe Zen Cart is horribly written and won’t work outside mod_php, but that’s a developer problem, not a PHP one.
Also Apache 2.x or later is not a specific version and if you’re still using Apache 1.x you have bigger problems.
In other words, a developer problem and not a PHP problem.
Yeah, sure I don’t. Personal attack already? Argument not going well?
That’s an imaginary risk. In terms of web development the differences between RHEL/CENT and, say, Ubuntu Server or SuSE aren’t significant.
The major difference being that you can target older versions of .NET even if the framework or IIS version has been upgraded. There is no pressure to target the latest, since there is only one framework and security updates are applied uniformly. It’s a clean system, but on web servers the inertia is behind LAMP, so Windows Server ends up with the same chicken-and-egg problem as Linux on the desktop.
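As a minimal sketch of what I mean (illustrative only; exact attribute support depends on the ASP.NET version installed): an app on a server that has been patched up to .NET 4.5 can still pin itself to 4.0 behavior in web.config, so security fixes arrive via framework updates while the app keeps targeting the old version:

    <!-- web.config: app keeps targeting 4.0 on an upgraded server -->
    <configuration>
      <system.web>
        <compilation targetFramework="4.0" />
        <httpRuntime targetFramework="4.0" />
      </system.web>
    </configuration>

There is no LAMP equivalent of that one-liner.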
I used it merely as an example since it is commonly used. You’ll find most of the competition to Zencart also targets a very recent version of PHP. It goes back to the treadmill problem that I was talking about.
It’s only imaginary if you think support risks are imaginary. Can all distros get a LAMP stack up and running from the repository? Of course, but the risks exist in third party software, especially where there is a commercial drive to sell to shared hosts whose typical environment is a CENT/RHEL-based LAMP setup. There is also a lot of Oracle stuff that isn’t supported on minor distros, and I’m sure that will get worse over time. In fact I fully expect Oracle to pull the rug at some point and make Unbreakable Linux look like the safer choice for web hosts.
And why are you using an old version of PHP? The PHP developers maintain two branches (5.4, 5.3) of their software. For each one they do a point release every month or two to fix issues (security ones, most of the time). So having an old (and probably vulnerable) version exposed to the outside world is not very wise.
What you need to do, if you use a release-based Linux distribution, is raise your voice and ask them to provide point releases instead of “backporting” bug fixes, which usually come late, if they come at all.
Backwards compatibility and security are not mutually exclusive. With .NET, hotfixes simply fix the code without requiring anyone to move to a newer version.
FYI, I don’t maintain older versions of PHP. But I have had to fix plenty of PHP code that was built against an older version. Sadly there are tons of web framework plug-ins and themes that were built without any regard for maintainability, which makes the problem even worse.
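Here’s the sort of patch-up I mean (a hedged sketch: the function_exists() guard is a standard PHP pattern, and the emulation is deliberately crude). Code written for PHP 5.2/5.3 that calls session_register() dies outright on 5.4, where the function was removed, so you end up shimming it:

    <?php
    // Hypothetical compatibility shim for legacy code on a newer PHP.
    // session_register() was removed in PHP 5.4; this crude stand-in
    // assumes session_start() has already been called.
    if (!function_exists('session_register')) {
        function session_register() {
            foreach (func_get_args() as $name) {
                if (isset($GLOBALS[$name])) {
                    $_SESSION[$name] = $GLOBALS[$name];
                }
            }
            return true;
        }
    }

It works, but every shim like this is unpaid maintenance caused by the treadmill.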
Maybe RHEL is the standard in Linux servers, but its long-term release model is not the best choice for all the packages that form the RHEL distribution. Yes, LTS kernel/core packages mean a stable system, but why would I want backported patches for all the web apps it provides? They don’t even trust that system on their own servers. For example, check which version of Bugzilla is available for install in the RHEL/CentOS repository. I’m sure they don’t have the latest one, which in fact is the one currently running on their own bug tracking server.
https://bugzilla.redhat.com/
Also (and sorry to bring you down from that cloud): (1) developers test against the upstream packages, which is why a package says it requires at least version X.Y, and (2) who told you upstream developers use RHEL in the first place? They usually use leaner and cleaner distributions (the less fat, the better).
I think you and others here have taken my criticism as an attack on all non-RHEL distros.
RHEL/CENT is the top choice for web hosts and the best choice for low conflict risk on LAMP servers. That doesn’t make it the best distro by any means, and in fact I haven’t met anyone who actually likes it. RHEL is like the Windows 98 of Linux distros. It’s a safe choice for compatibility and support, but it didn’t win the top spot through technical merit.
I’m talking about the commercial LAMP industry, not Linux developers in general. That industry targets the most common environment which is RHEL/CENT.
I prefer to own a first-class laptop. Sometimes you just need to take it with you, and a desktop can’t do that.
Of course, it runs a GNU/Linux distribution. (The default Windows installation on this laptop had bloatware such as the Bing toolbar pre-installed, and also at least two checkboxes checked by default to “send anonymous statistics”. It seems that in a few years, computers with proprietary software will be sending several gigabytes of “interesting” data to the software owner daily.)
Tablets are proprietary by design, so I dislike this device type most of all (except e-readers, where I don’t care whether they run something proprietary or not; such a device just needs a great screen for reading and a long-life battery, which is why mine is a Kindle). Also, they currently cannot connect to wired networks or external storage, which is why tablets suck twice over.
The desktop for me is the fallback option, and also the playground and multi-boot system (if I need something exotic, such as Windows or Solaris or even Haiku, in a non-virtual environment, I use it).
All my high-capacity storage is pluggable, so I don’t depend on what type of computer I currently use.
I almost agree with you. I bought a Sony PRS-505 because the PRS-700 lost some contrast to its touch screen and I didn’t trust the original Kindle to be 100% happy with Linux and 100% willing to let ME control which eBooks got “revoked” off my SD card.
“Another possibility is to keep Windows but replace Microsoft’s proprietary development environment.”
Actually, development is one of the reasons to move TO Windows. Visual Studio is simply one of the best dev environments around (obviously “in my opinion”) and .NET is extremely powerful too.
>Visual Studio is simply one of the best dev environments around
VS itself is good for a total newbie to create a crazily inefficient app with a few mouse clicks, one that will nonetheless work.
And for a senior developer it doesn’t really matter which text editor they use. The only thing that really matters is the number of advanced features available out of the box. As a productivity tool, VS is not so good without add-ons such as ReSharper.
>.NET is extremely powerful too
Not more powerful than the Java platform, to be honest.
Yes, its programming languages are far more expressive, but that leads to inefficient code and strange design much more easily than in stricter languages.
Little Java humor here:
“When I see that the application I run is based on Java, I’m really happy” -No User Ever
“Knock knock?” “Who’s there?” -5 seconds later “Java”
Also, I want to find the person responsible for creating the pop-up that reminds me to update Java, and punch them in the face.
>Also, I want to find the person responsible for creating the pop-up that reminds me to update Java, and punch them in the face.
Just Windows-user problems.
>”Knock knock?” “Who’s there?” -5 seconds later “Java”
-20 seconds later:
“.NET here, too. Sorry I’m late…
my Windows rebooted after a critical update.”
I try to never vote people down these days, because it seems childish and I’d rather see people voted up and the trolls and bad posters down at 1 vote. But you, sir, are a total troll.
VS is not the perfect IDE, but it’s a darn sight better than most. C# is not the perfect language and the CLR is not the perfect VM, but it trounces Java in every way.
>But you, sir, are a total troll.
>C# is not the perfect language and the CLR is not the perfect VM, but it trounces Java in every way.
I’m a troll in the sense that I’m currently involved in a .NET/C# project. (For me the most important things about work are fun, the atmosphere, and the people around me. And I don’t say that I dislike .NET. I dislike that it is almost always Windows-bound, and I dislike Microsoft’s longstanding policy of forcing users’ choices.)
As for .NET vs Java: .NET apps are frequently full of P/Invokes. At home I’m a Linux-only user, and I cannot use many .NET apps due to WinAPI dependencies. But as for Java, most ready-to-use apps run fine.
Also, I really like C#, but Java gives fewer chances for a mistake, especially at the level of application architecture. A less feature-rich thing is always easier, in the sense that you can know everything about it in detail. And you must know how the basics work; otherwise you can easily create inefficient crap. So the learning curve is longer for feature-rich things, and that is not always needed, because applying some simple patterns in less feature-rich things frequently gives the same good result, and is proven to work.
As for tools: I simply don’t know of a direct analog of Maven in the .NET world (no, NuGet is a parody of such a tool).
Regarding the auto-updater: I have people out looking for him, and also for the one who did the Adobe Acrobat updater. I particularly love the Acrobat one, because right-clicking the tray icon, which normally gives you the menu to close the craplet, actually just brings up the Acrobat updater window again.
It’s amazing to me how limited some devs who deal with OSS are. They have obviously never actually done any serious work with .NET. Those old clichés about inefficient apps from VS are just outdated rubbish now. They just make the people who repeat them look ill-informed.
ASP.NET MVC is IMHO up there with RoR in terms of robust (and fast) website creation. Although I will admit that the C# language has some ground to cover before it can challenge Ruby on brevity and dynamism. However, the amount of work that you can have done for you in VS is immense.
I really do think that MVC will be one of the products that will keep MS alive. Cause it sure as sh!t ain’t going to be Windows 8.
Although I will admit that the Ruby language has some ground to cover before it can challenge C# on performance.
The only thing you said which wasn’t total nonsense was “VS itself is good for a total newbie”.
>The only thing you said which wasn’t total nonsense was “VS itself is good for a total newbie”.
Yes, the other things I said are really total nonsense for those who use VS only and develop for Windows only.
I’m not saying anything is good or bad. But trying to move someone to Windows development in a thread devoted to free and open source software is, I think, just one more bit of nonsense here.
http://www.microsoft.com/visualstudio/eng/products/visual-studio-ex…
And your point is?
And my points are:
http://www.jetbrains.com/idea/
http://netbeans.org/
http://www.eclipse.org/
All of them are free and feature-rich.
And also all of them (and software created using them) work on:
Windows
OS X
Linux
Solaris/*illumos
*BSD
[more to come, haiku is already in progress, afaik]
And .NET is Windows-only, as Mono is mostly cut down to the web stack and doesn’t include WPF and so on; there are also tons of native bindings in popular apps such as Paint.NET. Yes, I forgot about MonoDroid.
Oh.. that’s free now?
One of the least pleasant Java IDEs I’ve used.
Presentable. Used it mainly for Flash development, but it is passable on other OSes. Not a patch on SharpDevelop/FlashDevelop though (both also free).
As is:
SharpDevelop/FlashDevelop (similar basis)
MonoDevelop (works on all of the major platforms you mention)
Visual Studio Express
No it is not. *sigh* Mono runs on half a dozen platforms, the Micro Framework runs on half a dozen embedded boards… there are other implementations (Portable.NET, for example) that worked on BeOS back before Haiku was self-hosting.
WPF is not a serious requirement on other platforms. The code I’ve written has always been based on an MVC/MVP pattern with a native widget set, so Mac has MonoMac and Linux has GTK#. Your point is pretty moot. Why would one want to restrict apps to a legacy widget set that is not a good fit for the underlying OS anyway?
Mono also now includes the full ASP.NET MVC3 with Razor and all of the Entity Framework… so it is a first-class citizen for the web and packs a real punch for a well-designed desktop app. But I’m going to be honest… no one is producing desktop apps anymore… not in commercial businesses. Everything we have done for the last 3+ years has been web-based.
Right tool for the right job… *shrugs*
After all, Java is now also under just another dying monopoly, and it is safer to use something like Python, especially for the web.
VS Express is not open, so it cannot be compared to the others. And yes, the IDEA core is open source under the Apache license.
Mono is cross-platform, yes, but it’s not truly the same platform.
Some other .NET/Mono projects worth remembering: http://en.wikipedia.org/wiki/Microsoft_XNA & http://monogame.codeplex.com/
Free and feature-rich; very multi-platform, across Windows, Xbox, Windows Phone, iOS, OS X, Android, and desktop Linux; with many existing examples in app stores.
Absolute crap. Yes you can make some monstrosities in WinForms … but I could probably do the same with Swing, SWT and other IDEs that have a designer.
It is fine without add-ons; I personally don’t like them. There is quite a lot under the surface, such as T4 templates.
It can be argued either way; however, Java is showing its age. Also, properties and LINQ make a lot of things very easy that would otherwise need tons of boilerplate code.
Err, this silly argument. As long as someone writes reasonably fast code today, that is fine.
I dunno how you think Java is more strict than C#, since they are both statically typed languages.
Although C# can make use of dynamic types as well.
Yes it can, I tend not to use them though.
I also only use them when it makes sense or is required by some API.
I follow the same school of thought as Anders: use static typing everywhere and dynamic only when required.
Especially in languages that have some kind of type inference.
I’m using Jetway’s industrial mini-ITX form-factor boards with included CPU and GPU (one with a VIA C7 and one with an Intel Atom 330), which can be swapped in place at very little cost if one ever breaks down.
See here:
http://www.jetwaycomputer.com/J7F2.html (VIA C7)
http://www.jetwaycomputer.com/NC92.html (Atom 330)
Both accept daughterboards for easy expansion. You can get them here:
http://www.mini-itx.com/store/default.asp?c=34&currency=1
Very high-quality manufacturing; long-life capacitors are used. Not much horsepower, but very stable and reliable hardware.
The next incarnation will be a cluster composed of AMD Fusion APUs with OpenCL GPGPU processing shared amongst the whole system. Of course, I bet it’ll be built from Jetway motherboards due to their reliability:
http://www.mini-itx.com/store/?c=69
I’ve no stock in Jetway; it’s just an honest testimony.
Kochise
Don’t you find them too slow? I’ve got an AMD Fusion E-450 system and it feels really slow to me, and it’s faster than the Atoms and VIA processors you linked to.
I’d think they would be too slow for a desktop. I use them to build Smoothwall boxes. They work great.
Yeah, a bit slow, but low power; you cannot expect much, of course. The VIA C7 runs under Windows XP, the Atom 330 under Windows 2000 (it used to run Ubuntu 9). An AMD E-450 would run Windows 7 with no problem at all. Sure, no great power under the hood, but great stability and nicely engineered hardware.
When I invest in a computer, I do not plan to throw it away after a couple of years. My VIA C7 has run since late 2007 (5 years) and my Atom 330 since late 2008 (4 years) with no problems. That’s what I’m looking for: reliability and accuracy.
Kochise
Accuracy? Is it lacking in other and/or more powerful machines? ;P
Accuracy: not oversold with false promises. They do not fail after a year.
Kochise
How about reliability, both in the sense of reliable vendors and reliable hardware?
Kochise,
I bought a Jetway mini PC for my parents due to the incredibly small form factor. Wouldn’t have bought it for myself due to not having enough USB ports or expansion slots, but gave them a shot.
http://www.newegg.com/Product/Product.aspx?Item=N82E16856107081
It’s slower than I would have liked, but not unexpected for a device in its class. The thing that I’m finding is that the little thing runs very hot all the time and the fans are very loud. It might be a lemon, but many of the reviews are similar.
This was obviously a bundle with a stock Jetway case; do you use something else?
I might give them a try again if I knew they had solved the fan/heat problem, but given that the one I bought was spec’d as having a “silent fan”, I’d have a lot of trouble trusting what they say in the future.
That’s the kind of computer I’m now looking for. Sure, not as powerful as a Core i7, but for what usage? I bet the box heats up a little because it’s still a dual-core 525 plus an Ion 2; even though it’s low power, the TDP is still about 20W or so. I don’t think you can do much about that; my mini-ITX boards are not cased at all, and the fan is a little noisy.
That’s why I’m looking at an AMD APU (like the E-450) with passive cooling. Strong enough for desktop usage (LibreOffice), and it can decode Full HD streams, play DX11 games (with medium details), etc… More than enough, from my point of view, for my usage.
Kochise
My servers are outsourced to the cloud, so I don’t have to worry (too much) about their hardware failing. But I do have backups (git checkouts, mostly).
My “desktop” is a MacBook, with Time Machine backup. It has never failed yet, but I know I can rescue it easily on another (Mac) device if I need to.
That’s enough for me, and I guess for most people. And a lot less time-consuming =)
That’s one of my absolute favorite things about the Mac platform. As long as you have access to a working Mac, you can recover from or repair just about any issue on another Mac, barring total hardware failure. Combine that with Time Machine for backups and you have a nearly foolproof recovery scheme.
There are a ton of reasons I don’t use OS X as much as GNU/Linux or Windows these days, but I still have a Mac mini tucked away as a server and if my laptop ever goes down I can jump on the Mac without missing a beat. My most important small files are duplicated across three online storage accounts and two thumb drives, and the big stuff is on two different external drives. One stays put up at home when not actively backing up, and the other is in a backpack that goes with me wherever I go.
It’s probably overkill, but the best thing about it is all of the recovery solutions I have, and all of my storage containers (online and physical) are cross-platform. I can install a new instance of any of the major OSes and within minutes have access to any file I need.
I’ve been freelancing (web design and development) for the past three years, and one of the critical aspects of my business is that I must have a machine available at all times. Over these few years I’ve been bitten by both hardware and software failures.
If I see Windows misbehaving in any way, I quickly do a check based on the problem (virus scan, system restore, etc). If the problem is persistent, I have a hard drive ready with an exact carbon copy of my software setup; I swap the drive and continue (Windows activation may kick in at times due to a simple hardware change).
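(The carbon copy itself is nothing fancy; here’s a hedged sketch of how one can be refreshed from a Linux live CD, with example device names you must verify before running anything:

    # clone the working system disk (sda) onto the standby disk (sdb);
    # double-check the device names first -- dd will not ask twice
    dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync

Imaging tools such as Clonezilla do the same job with more safety rails.)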
If the problem continues after this, I continue my work on my PC slate (with HDMI out; very important, as the slate’s screen is too small for design work) while I run hardware tests on my problematic machine. Once I find the hardware issue I’m just a call and a short drive away from getting a replacement part. Problem solved.
Throughout all of this, I have a development server running on a mini-ITX machine (Ubuntu Server). If that fails, I have XAMPP at the ready on both machines (main PC and slate) and restore everything from my backups.
Backups, backups, and backups; they should always live off my work machines. I have a NAS for this (and am going to set up another NAS off-site).
The key element in all of the above is that I did not skimp on the initial investment in hardware and software, and that I had a good plan in place. Meaning that I did not try to save money for the sake of saving money (say, by buying a brand machine with questionable parts), but instead bought each part with some forethought (the slate and the NAS included).
I’m saving money now because I’ve minimized troubleshooting time to the point that it costs me less to get myself up and running again than I would lose by not being able to work.
Moving to open source software for development is now easier than ever thanks to the standards, but it doesn’t matter if you use open or closed tools; in the end, the quality of the programmer is what makes the difference.
>in the end, the quality of the programmer is what makes the difference.
Yes, I fully agree with this viewpoint. It’s just that in the closed world this quality dies with the programmer, but in the open world it lives forever in publicly available, well-written source code, or in design principles over which no one will sue anyone else.
“Open” does not always mean “open source”; it is the freedom of choice. You choose proprietary software if it fits your needs, and you choose free software where it performs better. And the interoperability and interchangeability are by design.
Even some proprietary companies live this way. They just do better than others, don’t sue anyone, and don’t restrict users’ freedom of choice, because they are sure their products are awesome and users will choose them anyway. The programmers actively engage with users and the community, and they are proud of their work and work even better.
Ever heard of patents and restricted licenses?
>Ever heard of patents and restricted licenses?
Yes, the whole world has a lot of fun watching this soap opera.
This alone proves you have little knowledge of how real commercial development works in the majority of the business world. Open Source simply doesn’t exist in any meaningful way, end of story.
Be aware: open source tools do NOT automatically create open source, e.g. using Eclipse doesn’t create open source any more than using Visual Studio does. Thinking any different is naive.
Quite true.
It was an eye-opener for me to move from the cosy open source world of the university into the harsh reality of the enterprise and big corporations.
It’s almost impossible to sell the concept of open source to a board. “We spend money developing a product, then we give away the source code for free so anyone can build their own copy??? What kind of insane business do you think we are running??” Building a viable business proposition that produces open source is incredibly hard; moreover, succeeding with an open source business model is exponentially harder than with a closed source one. Not impossible, but most larger companies that do this kind of thing also have other interests (Google, as an example, has the search and advertising business to prop up its many free/open projects).
>This alone proves you have little knowledge of how real commercial development works in the majority of the business world.
Of course; that’s why the Linux kernel is currently one of the most actively developed and most widely commercially used projects. The illumos kernel is now developed by several corporations, too. Most advanced Java-related companies have IDEs with an open core. And even Microsoft releases Entity Framework, ASP.NET MVC, and some other important parts of its product line as open source software.
>Open Source simply doesn’t exist in any meaningful way, end of story.
For you, it does not. Nor for all those who do not understand that code sharing is better than reinventing the wheel inside every new company. Major players share parts of their codebase as open source under liberal licenses. Software becomes harder and harder to maintain, and only shared efforts to develop it will lead the players to victory.
And there are other benefits: community growth, a larger tester crowd, the company’s reputation for being liberal and open, and also recruitment: newcomers are already familiar with the product’s core when they come to work.
No, I’m not a fan of GPLv3-only, etc. Software may have closed components. But an open core and good interoperability are the main keys to success.
It’s not that simple or easy; there are outside factors at play.
For example, Opera: they’re among the biggest supporters of the open web and so on, yet the Opera desktop browser is much less popular than, say, IE.
Only on mobile is it reversed (mobile versions of Opera enjoy huge adoption; IE hardly registers)… more social factors and dynamics play a role.
A good, serious article. It’s unfortunate that some of the people commenting here aren’t capable of equally serious criticism and instead resort to juvenile responses. That hasn’t been cool since the ’90s, people.
Though I agree with the article about the shifting sands of Microsoft’s developer environments, the WAMP platform suffers from the same problems! Whichever stack you choose, you face a major issue with backwards compatibility if you’re an IT organization. The problem is not the MS stack but the instability of all dev stacks.
I agree about the lack of vendor support these days. I still see some hand-rolled fixes by IBM and Oracle but most vendors follow MS on this and have given up on that long ago. Like the quickly changing dev stacks, this makes sense for the vendors but leaves IT customers hanging.
The fact is that it’s just plain tougher for IT to keep up today, esp. given the invasion of handhelds.
Does it occur to you that the technology suitable for you and your mother is not optimal on a larger scale? In fact, small-scale solutions rarely scale up. For example, building model airplanes is totally unlike building a 787. Building a hot rod is not like building 5,000 Fords a week. Do you have any idea what it takes to deploy and support tens of thousands of systems across the world, in multiple languages, with hundreds of printers and peripherals from dozens of manufacturers? Red Hat of course claims to support this type of environment, but then you are just trading a little Microsoft for the big one.
You probably mean scalability, in which GNU/Linux is probably the best on the whole market [as is the whole UNIX ecosystem]: from small ARM/MIPS, etc. devices to big servers running big CPUs, or ARM/MIPS. There is actually no other OS that gives you such outstanding scalability OOTB.
What you are talking about is rather commercial support, in which case I can agree, but it’s getting better and better over time in the FLOSS world.
Some people say software support is not so good on the OSS side. My company and I use a distribution called OS4 and it’s just as good as Windows and Red Hat Linux. The support side of it is awesome. We have a support subscription with the company, PC/OpenSystems LLC, and they are wonderful. We haven’t run into a problem they haven’t been able to fix. The guys are courteous and just wonderful. I recommend them to anyone. http://www.os4online.com
If you want really high availability of hardware/software, then you should detach yourself from any particular vendor. Your environment should be totally independent of the underlying platform.
That is, if you are thinking about recovery tools, you are on the wrong path. If you are thinking about mobo replaceability, you are on the wrong path.
True availability comes when you can replace your work machine with absolutely any available box, regardless of the HW/SW in that box. Of course you need to set some minimums, like 4+ GB RAM and a 300+ GB HDD, but that’s it.
How to achieve this? You create a virtual image with your development tools and put it on network storage. On your host computer you install your virtualisation software, and off you go. You can replace any part of the underlying hardware in an instant without caring what failed and why.
What needs some investment is the backend: you need some sort of high-availability network storage, like a NAS/SAN, where it is proven that if you pull out an arbitrary hard disk and replace it with an empty one, no data is lost.
That is the only way to get a high-availability workstation. You should not care about the underlying HW or SW. The operating system is totally irrelevant in this case.
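To make it concrete, here is a hedged sketch with qemu/KVM (any hypervisor works the same way; the NAS host, paths, and sizes are examples): the guest image lives on the network storage, and any box that can run the hypervisor becomes your workstation:

    # pull the work image from the NAS onto whatever box is handy
    rsync -a nas:/vms/workstation.qcow2 /var/vms/
    # boot it; the hardware underneath no longer matters
    qemu-system-x86_64 -m 4096 -smp 2 \
        -drive file=/var/vms/workstation.qcow2,if=virtio \
        -net nic -net user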
I’d agree with most of the things you wrote; however:
– there are performance issues [still, even though we have powerful CPUs]
– there are security issues [the underlying OS should not be susceptible to malware attacks/infections, which automatically excludes the Windows ecosystem]. Besides, VMs are not the ideal solution when it comes to privilege escalation and access to the host system [which is definitely possible and exploitable]
– a NAS/SAN IS in fact a hardware [and software] investment anyway, so that kinda stands in your way here. However, a regular PC should be enough to run such a VM, as long as the host HW has strong VM capabilities [which most hardware actually has now].
Anyhow, I like the whole concept.
Regards
That is a load of crap. It was a fair statement in the year 2000… but today it is utter rubbish.
I like the approach in the article but I think he could have pushed it further to its natural conclusion.
This would mean having all your work in a virtual machine guest OS. Then just back up and propagate that guest OS onto different hosts as needed.
This completely decouples your working environment from the underlying hardware. It’s easier than the more complicated approach the article suggests.
You can also make project-specific VMs, which I do with some regularity.
When the project ends, you just shelve the whole thing. If there is a bug reported, new feature needed, etc, you can go right back to it, even years later, with everything just as you left it.
Of course, you still need some sort of OS and hardware to run your VMs, no matter how thin a layer it is, so you cannot really escape the cycle entirely.
“project-specific VMs”
I like it, good idea. I think maybe that’s the next logical step for me. Thanks.
As you say, ultimately there is a hardware connection or dependency somewhere. But if you can disguise it and decouple as much as possible, it’s a good thing.
+1
All of my dev work is via VMs these days. The VM is transportable, and I can use it on any OS I choose (so my work Win 7 laptop runs it most days, but my Mac Mini also gets a look in).
…I am only using Linux and FLOSS in my new company.
The main quibble I have is that the quality of entry-level stock components has gone down over the years and the failure rate has gone up. Unless you either have a great relationship with a local hardware dealer (and these are becoming increasingly difficult to find) or have someone on staff who can invest the required time and effort to ensure that you’re sourcing good components, this business strategy could wind up soaking up an awful lot of otherwise billable time.
The only other issue I would raise is that using desktop Linux as part of a small-business IT strategy, even a business that does IT, is going to require more training than most computer users will willingly undertake. Most computer users, including IT professionals, are basically like most automobile users. They know how to put in gas and use the steering wheel, and are otherwise quite content to remain ignorant of the technology they depend upon.
Although the actual usability gap between Windows 7 and the currently most popular Linux distros is virtually nonexistent, stuff happens. When stuff happens in Windows, people shrug and accept it. When stuff happens in Linux, the typical response is a tirade about how crappy Linux is. Even if you’re in a position to insist that your consultants are Linux-knowledgeable, I foresee needing the one box running Windows (probably with a QuickBooks license) for whoever is answering the phone and keeping the books.
That’s true for all of the technology we use. You might know how to compile a kernel, figure out which kernel module you need to get the sound card to work correctly, or be able to diagnose that the constant dropping of the wireless signal is based on a buggy driver that hasn’t been updated yet by your distro of choice (and when you bring the problem up someone invariably offers a “superior” distro that you should switch to).
But could you go through the driver code, line by line, and solve the problem? Could you design your own PCIe card? Have any idea of what the individual traces do? Or the SATA signaling, what the pre-amble on the SATA command is for (or what it consists of)? Could you re-solder a cracked motherboard (or even know how the different trace lengths might affect timing?)
We all have a demarcation point with technology. There is a huge (quite literally) unfathomable, by a single human mind, amount of complexity hidden from us in the technology that we come to rely on, that not only do we chose to ignore, but we couldn’t effectively use the technology without most of it being hidden.
But that’s the point. The more of the technology that is hidden from us, the more useful it is. Computers used to be programmed in machine language, then assembler, then C and other compiled languages, then scripted languages. Every layer we bury from sight means we’ve reached a new level.
There will always be a need for those that understand the deeper levels (and those people will be highly valued), but it’s not necessary (nor practical) that we all do.
I put gas in my car, and my car’s computer tells me when to change the oil and perform other maintenance. That’s fine by me, because I use it to get around. The workings of it don’t interest me, but I enjoy the benefits that it brings. Barring a zombie apocalypse, I’m fine with that relationship. There are people that love to tear down engines and rebuild… whatever in an engine. And that’s great. But thankfully today, you don’t need to know that to own a car.
tony,
I agree with you & the OP. Generally most people don’t need to know the low level details, and that’s a good thing because it makes us more efficient and less distracted.
“The more of the technology that is hidden from us, the more useful it is.”
My own view though is that the low level things should remain out of the way, yet accessible for those of us who’d benefit from writing/installing third party modifications. We’re seeing many modern platforms simply cutting off access to low levels. That’s a big problem because it represents a growing inequality regarding access for developers/engineers who’d otherwise be able to further drive innovation.
It’s possibly perceived as much easier and/or cheaper (also for support, when many “average users” accidentally go too low) to lock things down… or maybe the idea, for the parent companies behind some platforms, is to not have too many outside devs able to compete with them?
And before that, re-plugging cables or flipping binary switches.
I wonder what is the next level…
PS. Maybe distribution of task-specific VMs, with everything needed nicely included and not much else? (Versus recent projects like the RPi, which seem to focus more on hardware, so a bit stuck in the past; of course, the RPi is genuinely useful for many things… but one goal, offering a safe way to experiment with OSes & programming while isolating potential damage, can be nicely covered by VMs.)
The whole point of the RPi is to bring back the days when students came into university Computer Science programs primed with deep knowledge of the system.
That means four sub-goals:
1. Convince parents it’s safe to let the kid tinker like mad (tricky with a VM)
2. Let the kid explore as deeply as they want (tricky to give the feeling of with a VM)
3. Give the kids something to interact with the real world in fun ways like the GPIO header on old Commodore and BBC Micro computers. (impossible with a VM)
4. Convince schools to have a ready supply of them. (Easier when you satisfy the first three goals and offer it cheaper than the machines to run VMs on)
Not to mention that you always feel happier about something when it’s your own personal thing rather than something to share with your parents, brothers, and sisters.
The RPi’s price point also gives schools the option to say “Give us $35 and you can take it home to play around with and keep it when the semester is over.” (Plus whatever the SD cards cost in bulk, of course)
I remember hearing Eben analogize it to giving the kid a bike rather than letting them muck around with the family car.
Not being funny, but the other day I installed Fedora 17. Booting from a USB stick, I had an error that basically stopped the live distro from booting up; I googled the problem and there were clues on how to fix it. I guessed that I had to remap the UUID of the USB drive in Grub, so I resorted to writing down the UUIDs of my disks (one was 16 characters long and I think it was my SD card).
The thing is that with a lot of Windows errors there tends to be a workaround… when Linux dumps you at a terminal with a cryptic error message, or something just offers no output after erroring out (usually GUI apps that are basically a front end to the CLI)… it does become somewhat frustrating.
In reality there are very few Windows errors now that aren’t friendly.
lucas_maximus,
“Not being funny, but the other day I installed Fedora 17. Booting from a USB stick, I had an error that basically stopped the live distro from booting up; I googled the problem and there were clues on how to fix it. I guessed that I had to remap the UUID of the USB drive in Grub, so I resorted to writing down the UUIDs of my disks (one was 16 characters long and I think it was my SD card).”
I’ve had long-standing issues with Grub on removable media. Infuriatingly, they didn’t fix this with grub2. The partition map grub used was probably incorrect for your system, and grub stabs around cluelessly, loading from arbitrary drives. If you manually fixed the UUID, you should also check that it’s loading the right kernel as well. It might still be using a kernel on your hard drive and just using the UUID to boot the distro on your SD card.
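If anyone else hits this, the usual repair (a hedged sketch; the UUID is a placeholder for the one you wrote down) is to make both grub’s root lookup and the kernel’s root= argument go by UUID instead of device name, so it stops mattering which drive grub thinks it booted from:

    # grub.cfg menu entry located by filesystem UUID rather than (hd0,1)
    search --no-floppy --fs-uuid --set=root 1234-ABCD
    linux  /boot/vmlinuz root=UUID=1234-ABCD ro
    initrd /boot/initrd.img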
Is this an image you compiled yourself? Most Linux live boot disks use syslinux instead, because it doesn’t get confused about what media it needs to boot off of.
“The thing is that with a lot of Windows errors there tends to be a workaround… when Linux dumps you at a terminal with a cryptic error message, or something just offers no output after erroring out…”
There’s no excusing the grub problem you had, but I don’t think your generalisation is fair. Sometimes the easiest path to fixing a Windows problem is to reinstall it.
I actually disagree, unless the machine is or was virus-infested (there is no way to be sure there isn’t something the AV didn’t catch).
A lot of Windows problems can easily be fixed by either looking through the event viewer, testing the memory (I use the Windows memory tester CD… it seems to work fine and isn’t as verbose as memtest), or using tools such as CCleaner and the task manager.
Windows isn’t a magic box that happens to work until it stops working.
lucas_maximus,
“A lot of Windows problems can easily be fixed by either looking through the event viewer, testing the memory (I use the Windows memory tester CD… it seems to work fine and isn’t as verbose as memtest), or using tools such as CCleaner and the task manager.”
Sure, but the implication is that linux lacks the tools to debug itself as well, which is untrue. If that’s not what you meant to imply, then please clarify.
“Windows isn’t a magic box that happens to work until it stops working.”
Neither is linux, for that matter. Either platform can run reliably for ages.
No, it is not a magic box.
(Trollmode ON)
But Windows is like an American car, which needs a tune-up every 3 months or it breaks apart, while Linux/BSD are European cars, which run without a tune-up for 6+ months, lol
(Trollmode OFF)
Which European cars? If they are French cars or Italian motorcycles, expect them not to work if it rains (I’m not even joking; I have owned a Harley Sportster 1200 and an Aprilia RS-250, one European and one with an American/Euro engine, and it doesn’t start easily if it is raining).
Actually, since Windows 7, Windows is more like a Honda CB-125 motorcycle engine. You can abuse it as much as you want and it will keep on working… it might limp along and not get you there as fast… but it still works.
😛
When did I say that? I simply said that with Windows, problems can be solved using those tools… and things can be solved without a reinstall, with a little understanding of the underlying system.
True.
My comment is more about the fact that people tend not to look for the root cause of the problem.
A lot of Linux users “distro hop” because they can’t solve their problems using <distro X> and try using <distro Y> in the hope that it will solve their problems.
When in fact there is an underlying issue that needs to be addressed.
Either way, reinstalling or installing another operating system is not the way to solve problems. It is better to scrutinize the root cause of the problem.
In short it is better to understand why you are having problem first before deciding a strategy to deal with it.
lucas_maximus,
“When did I say that? I simply said that with Windows, problems can be solved using those tools. Nothing more.”
Fair enough, I was just reading your statement in context of the previous windows/linux generalisation.
Another tool everyone on Windows should have is Sysinternals Process Explorer. It shows processes, resources, open files, etc., and it allows you to forcefully kill hidden processes that are otherwise difficult to kill and that are responsible for locking files.
http://technet.microsoft.com/en-us/sysinternals/bb896653
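If you ever want to do from a script what Process Explorer’s “Find Handle or DLL” search does interactively, something along these lines works. A rough sketch, assuming the third-party psutil package; the locked file’s path is a hypothetical stand-in:

    import psutil

    TARGET = r"C:\some\locked\file.txt"  # hypothetical locked file

    # Walk every process we can inspect and report any that hold the file open.
    for proc in psutil.process_iter(["pid", "name"]):
        try:
            if any(f.path.lower() == TARGET.lower() for f in proc.open_files()):
                print(proc.info["pid"], proc.info["name"])
                # proc.kill() would then end it forcefully, much as
                # Process Explorer can.
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            continue  # processes we can't open, or ones that just exited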
Edit:
“ … and things can be fixed without a reinstall with a little understanding of the underlying system.”
That assumes you can track down the problem, of course. I’ve had many people come to me for help fixing something or other on Windows. Usually it’s something pretty trivial, but other times I have to recommend restoring or reinstalling because it’s less work than trying to find out what’s wrong. Sometimes just re-creating a user account is enough.
Mind you, Linux can be the same way, but it’s usually easier to diff the files and/or look at timestamps there. On Windows you’ve got the whole registry to deal with.
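As an example of the timestamp approach, here is a minimal sketch that lists everything under /etc modified in the last day, which often points straight at whatever config a misbehaving update or tool touched:

    import os, time

    CUTOFF = time.time() - 24 * 3600  # anything changed in the last 24 hours

    for root, _, files in os.walk("/etc"):
        for name in files:
            path = os.path.join(root, name)
            try:
                if os.path.getmtime(path) > CUTOFF:
                    print(path)
            except OSError:
                continue  # broken symlinks, permission errors, etc.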
For that specific usage, Unlocker is IMHO much handier: http://www.emptyloop.com/unlocker/ – it integrates nicely into the right-click menu and can itself perform actions on the file, which avoids Explorer locking the file again.
PS. Hm, and apparently it has become prominent enough to have its own page: http://en.wikipedia.org/wiki/Unlocker
I think that’s looking at the past through somewhat rose-tinted glasses. Computers of the past were often notoriously unreliable (remember 8-bit micros?).
And/or, entry-level components in the PC world now have a much lower absolute price, so it’s not that bad of a trade-off.
Around the year 2000 I was at a company where the computers, while not 100% generic, were Dell desktops with similar specs, about as close to generic as possible. The only proprietary things they had were the front case design, the Dell badge, and their Phoenix BIOS; hardware-wise they were very generic. We ran them on Win 98 and had an image made with PQDI (PowerQuest Drive Image) to restore any broken system in a couple of minutes.
In terms of software, the problem is that many corporate environments are tied to MS due to many factors, and to this day most people can’t even conceive of systems running anything other than Windows or Mac OS.
Someone mentioned the problem of MS Office having rendering issues with itself. The problem was (and maybe still is) that document rendering depends on the output device (the printer). If someone created a document on a machine with a dot-matrix or inkjet printer back in the nineties, then opened it on a machine with a laser printer, the document was reformatted and a few things ended up misplaced or outside the margins.
Since most printers these days offer at least 600 dpi, it is far less common for that to happen.
It’s hard to call a Phoenix BIOS non-generic, though. I bet the article author’s computers also run a proprietary BIOS … that doesn’t mean they’re not generic.
My requirement is an OS where I can actually have the microphone jack mute the speakers on a laptop without dicking around on the command line for two hours, only to then find out that the ‘fix’ breaks hibernation. My requirement is multiple display support that actually lets applications be aware *SHOCK* of screen edges and the concept of a ‘primary display’. My requirement is a clipboard that doesn’t mangle things across applications every five seconds because there are some twenty different clipboard formats and controllers, to the point that you can’t even hit copy in a browser and paste into GIMP without it mangling the colorspace.
My requirement is to not have to dick around with config files just to unlock resolutions other than 800×600, or worse, have to force 16-bit video mode because the open source drivers get down on their knees in front of the proverbial equine. I require LEGIBLE fonts that are more concerned with making sure I can read them than with whether the ‘glyph is properly formed’, and that don’t kern text like a crack addict. My requirement is that my GTX 560 Ti is fully supported and doesn’t behave like the Ge8800 I threw in the trash six years ago.
But more importantly, I require applications that actually WORK and provide full functionality, instead of the pathetic crippleware that LOOKS good but is like a trip in the wayback machine to Windows 3.0! Worse, most tasks involve so much screwing around on the command line for mundane stuff that I might as well throw my i7 machine in the trash and drag out the Trash-80 Model 12 for all the ‘improvement’ it offers.
Which, as a desktop OS, I’ve yet to see ANY *nix (even OS X) provide. *nix is great for servers; keep it there. For my desktop, it can shtup right off! To be brutally frank, as a desktop OS the slate of different WMs for X11 hasn’t even caught up to Windows 3.1 in functionality, apart from long filename support.
But what do I know? I consider Windows 98 the pinnacle of computer UI design and most everything since to be pointless, idiotic changes that make it less functional. At least with Win7 and earlier I can turn all the idiocy off.
Which is why Win8, unless you hack the hell out of the registry and load up on third-party software, ends up lumped with every other blasted OS in “welcome to the worst of 1994” usability.
This is just a little mistake in the details:
Microsoft Office is available for the Mac from Microsoft*. But I guess that is also an exception.
* though I think it is actually ported and worked on by an external company?
No, Microsoft does it.
And it’s not the only example of beyond-Windows products from Microsoft – http://en.wikipedia.org/wiki/Xbox_SmartGlass can be used just as well with iOS or Android handsets. Also Skype … it even has a desktop Linux version, with ongoing development.
I might be wrong, but I specifically remember there being a company whose employees go to the Microsoft buildings every day and work on it there.
They bought Skype; we’ll have to see what happens, won’t we?
You likely remember wrong, or are thinking of something else. Also keep in mind that Office for Mac is one of the oldest active MS product lines.
And yes, Microsoft did buy Skype some time ago already, and nothing bad happened to it. Nothing is going to happen with regard to Skype’s multi-platform nature; that would jeopardize its status as the largest international voice carrier. As for what happens now: Microsoft is retiring Windows Live Messenger in favour of Skype, and likely wishes for Skype to become the IM client.