With the introduction of Windows XP Service Pack 2, Microsoft introduced the Security Center.
Disabling The Security Center
Let's get this one out of the way first. To disable the Windows XP Security Center open the Services MMC:
Press the Windows and R keys > in the resulting window type services.msc and press Enter > scroll down to Security Center and double-click on it > in the "Startup type:" box choose "Disabled" > click Apply, then either reboot or click the Stop button.
Configuring the Security Center
Before you go any further...
-Run Windows Update.
-If you are using a 3rd party firewall, update it now.
-If you are using an Anti-Virus program, update it now. If you aren't, shame on you, get one! (I use AVG.)
To open the Security Center open Control Panel:
Press the Windows and R keys > in the resulting window type control.exe and press Enter > in the resulting window double-click on the Security Center icon.
[Oct 28, 2004] Several useful Microsoft papers about setting more restrictive Unix-style (root vs. regular user) permissions in Windows XP and 2000. The easiest way to set them is via the SHRPUBW utility, which can be launched from the Run menu.
Reflections on Witty by Nicholas Weaver and Dan Ellis. Very good.
On March 20th, 2004, an attacker released a single-packet UDP worm, Witty, into the wild. Although it infected only roughly 12,000 machines and was less than 700 bytes long, this worm represents a dangerous trend in malicious code. The attack is well understood: there have been several analyses [lurhq, disassembly] of the worm itself, and an excellent analysis by Moore and Shannon on the network propagation [caida_witty], including the presence of seeding or hitlisting (starting the worm on a group of systems to speed the initial propagation). But what can we learn about the attacker?
Examining the timeline of events, the worm itself, its malicious payload, and the skills required all point to a sophisticated attacker. Witty was written by an author who was motivated, sophisticated, skilled, and malicious. Although there have been previous well-engineered worms (notably the Morris worm and Nimda), Witty represents a dangerous new trend, combining both skill and malice.
It's actually unfortunate that Witty hasn't gotten the attention lavished on previous worms, as it was a very significant attack. The worm carried a payload malicious to the host computer and was released with almost no time to patch systems. It contained no significant bugs and was written by a malicious author deeply familiar with the theoretical and practical state of the art in worm construction and computer security.
"Once you start down the road with that analogy, you get stuck in it," said Scott Charney, chief security strategist for Redmond, Wash.-based Microsoft.
Charney says monoculture theory doesn't suggest any reasonable solutions; more use of the Linux open-source operating system, a rival to Microsoft Windows, might create a "duoculture," but that would hardly deter sophisticated hackers.
True diversity, Charney said, would require thousands of different operating systems, which would make integrating computer systems and networks virtually impossible. Without a Microsoft monoculture, he said, most of the recent progress in information technology could not have happened.
Another difference: computers can be unplugged from the network and rebooted; organisms cannot.
The theory also has skeptics outside of Microsoft.
Security consultant Marcus Ranum has emphasized that many network threats have little to do with the vulnerabilities of monoculture. Planting three strains of corn offers insurance against some diseases, he notes, but without a fence, deer will eat all three.
But Ranum also says the monoculture story "would barely be news" if @stake "hadn't done a brilliant surgical marketing strike on its left foot by firing Dan."
At an October hearing of the House Government Reform Committee's technology subcommittee, Steven Cooper - the Homeland Security Department's chief information officer - was questioned about the federal government's vulnerability to monoculture.
Cooper acknowledged it was a concern and said the department would likely expand its use of Linux and Unix as a precaution.
The monoculture idea is also influencing how experts look for solutions to security problems.
Mike Reiter of Carnegie-Mellon University and Stephanie Forrest, a University of New Mexico biologist who has been gleaning lessons for computer security from living organisms for years, recently received a $750,000 National Science Foundation grant to study methods to automatically diversify software code.
Daniel DuVarney and R. Sekar of the State University of New York-Stony Brook are exploring "benign mutations" that would diversify software, preserving the functional portions of code but shaking up the nonfunctional portions that are often targeted by viruses.
Which Culture? (Score:5, Interesting)
by smccto (667454) on Monday February 16, @11:07AM (#8294315)
Monoculture or Diversity?
The AP ran a story this weekend, captured by Yahoo [yahoo.com], talking about Dan Geer and his theories of how the Microsoft monoculture endangers computer security. I have concerns.
Although I know this won't fend off the zealots who just need to speak their mind, else their puny little heads explode off of their shoulders, atrophied from lack of lifting their hands any higher than a keyboard, I offer this caveat: What I'm about to present is merely philosophical rambling, curious wonder, nothing more than an innocent what if. It is, in no way, intended to offer an argument, solution, opposition, or anything else that would offend (other than those puny headed, shoulderless freaks).
Just the facts, ma'am
I found it intriguing that, as the AP article mentioned:"Steven Cooper, the Homeland Security Department's chief information officer... acknowledged [monoculture] was a concern and said the department would likely expand its use of Linux and Unix as a precaution."
Why haven't Mr. Cooper, the media, and the supposed security experts who promote U/Linux as a safe alternative acknowledged that U/Linux systems also have their share of security advisories? Take a look at Secunia [secunia.com] and their product listing [secunia.com]. Doesn't anyone care that Solaris 9 [secunia.com] had more advisories (42) in 2003 than Windows 2000 Server [secunia.com] (36)? Doesn't it scare anyone that, while Windows XP Home edition [secunia.com] had 32 advisories, Red Hat 9 had more than twice as many with 72? Debian 3 [secunia.com] had 186!
Doesn't Open Source claim [devx.com] to have a better development model by throwing more eyeballs at the source code, thereby eliminating - or minimizing - security flaws earlier?
Missing the forest for the trees
Take a look at this, also from the AP article:
"Mike Reiter of Carnegie-Mellon University and Stephanie Forrest, a University of New Mexico biologist who has been gleaning lessons for computer security from living organisms for years, recently received a $750,000 National Science Foundation grant to study methods to automatically diversify software code.
Daniel DuVarney and R. Sekar of the State University of New York-Stony Brook are exploring "benign mutations" that would diversify software, preserving the functional portions of code but shaking up the nonfunctional portions that are often targeted by viruses."
Are these people frickin bonkers? We're barely capable of securing the simplest SMTP and FTP services. Software is already beyond our comprehension [sun.com]. What makes us so arrogant as to assume we can write software that makes other software more secure - without breaking it, without opening unforeseen security breaches? We are decades away from being that intelligent.
Of course, on the plus side of this approach, as software gets more complicated, it will be too obfuscated for the Puny Heads to understand and, therefore, will be a great deterrent for attacks! (Yeah, sarcasm)
Dan Geer likes to compare the information world to that of biology, equating computer viruses with biological viruses. I have one problem with this way of thinking. Biological viruses simply exist, have always existed and will always exist. They don't have an agenda. They don't have malicious intent. They aren't scheduled or targeted. They are nature. It's the way the system works. The global ecosystem is self-maintaining, self-cleaning and always changing. No good or bad, just the way it is (unless of course, you're one of the organisms destined for cleaning).
How does this differ from computer viruses? Choice. Without getting theological, biology evolves without thought; there is no such thing as a coordinating strategy for doing something that wouldn't have otherwise happened naturally. Computer viruses, on the other hand, are man-made. They are imagined, planned, manufactured and deployed with malicious intent. The designers have made a conscious decision to do something widely recognized as bad.
Dan Geer's culture is that of flowing electrons. It has nothing to do with socio-economic diversities, geopolitical bents or why 27 year old boys who can't find a date would rather simulate guerilla combat from the confines of their childhood bedrooms. His views, much like those of a typical radio talk show host, are myopic, prejudiced and, at the very worst (and quite possibly most dangerous), subjective.
The culture that we should focus on is that of the people responsible for software. Microsoft has taken the initial step to change their internal culture. Their mandate to write more secure software is a good step. You may call it mere marketing, but as long as it actually churns out safer software, then who cares - we all win. We all still wait, with bated breath, to experience the result of their efforts. If they fail in this, if Longhorn has even a single critical advisory within its first six months, then Microsoft is doomed.
I suppose it's wrong to mention... (Score:5, Interesting)
by prisoner-of-enigma (535770) on Monday February 16, @09:16AM (#8293296)
...that Geer's against monoculture but doesn't explore the effects of what would be needed to overcome that monoculture.
As outlined in the article (assuming anyone reads it), critics of Geer point out that simply adding a new OS into the mix (dare I say Linux?) wouldn't substantially help. You'd have a duoculture instead of a monoculture. How much more difficult would it be for hackers to create a devastating hack? It even extends beyond OS's. Apache has the majority market share for all web servers worldwide. What effect would a devastating Apache exploit have on such a near-monoculture? Nobody wants to say anything about that, though, because Apache represents the side of good and Microsoft is evil.
To truly achieve the technological equivalent of biodiversity, we'd need hundreds or thousands of OS's and differing applications. The complexity of trying to get all that crap to work together would be impossible, especially since convergence of any two app's/OS's would be actively discouraged to prevent cross-pollination-type attacks.
It's all well and good to bash Microsoft's monoculture. I'm sure there are many here who'll do nothing but that. However, defining the problem is only the first step; you must present a practical, workable solution. Just saying "Linux will fix it all" simply replaces one monoculture with another. But I bet most people here haven't thought that far ahead.
Re:I suppose it's wrong to mention... (Score:2)
by prisoner-of-enigma (535770) on Monday February 16, @11:19AM (#8294466)
As a first step, I would suggest that everyone using MS operating systems stop using Outlook and IE.
This alone would practically stop 95% of all Internet-based attacks aimed at Windows machines. Which again goes to show that it's not so much the OS that's at risk as it is the applications.
As far as integration goes, I think HTML, HTTP and TCP/IP show how easy this can be if we can come up with standards for data formats and transmission protocols.
I disagree. These protocols do very simple things and none of them are secure. Look at the current problems we're having with SMTP mail. It is an inherently insecure protocol that offers no integrated method to determine the authenticity of the sender, leaving the way open for massive reply-to-spoofing spam companies like we have today. TCP/IP doesn't handle security, either, and neither does HTTP (HTTPS excepted, of course). HTML is still far more limited than even a garden-variety word processor when it comes to displaying complexly-formatted documents. You're giving examples of simple components like nuts and bolts. I'm talking about the whole machine.
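The point about SMTP is easy to demonstrate: nothing in the protocol verifies the From header, so a message can claim any sender it likes. A minimal sketch using Python's standard email module (the addresses are made up for illustration):

```python
from email.message import EmailMessage

# SMTP transports whatever headers the client supplies; the protocol
# itself never checks that the From address belongs to the real sender.
msg = EmailMessage()
msg["From"] = "ceo@example.com"      # forged: any string is accepted
msg["To"] = "victim@example.net"
msg["Subject"] = "Urgent"
msg.set_content("Please open the attachment.")

# The envelope sender handed to the SMTP server (MAIL FROM) can differ
# from this header too, which is exactly what spammers exploit.
print(msg["From"])  # -> ceo@example.com
```

Authentication schemes like SPF and digital signatures exist precisely because they have to be bolted on from outside the protocol.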
thousands of OSes aren't required (Score:1)
by cdhowe (738664) on Monday February 16, @12:24PM (#8295164)
Actually, there's research and literature that examines how big an "N" you need in N-version software diversity for survivability, and it isn't thousands; in fact, many operational high-reliability systems actually only use two versions of software (the space shuttle's computers are built this way as are some aircraft systems). So the comment of needing thousands of OSes really isn't true.
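The idea behind N-version designs can be sketched in a few lines: run independently written implementations of the same function and accept a result only when they agree. This is a toy illustration of the N=2 self-checking pair described above, not the shuttle's actual scheme:

```python
def votes_agree(implementations, x):
    """Run each independently developed implementation on the same
    input; return the result only if all agree, otherwise fail loudly
    (the behavior of a self-checking N-version pair)."""
    results = [impl(x) for impl in implementations]
    if all(r == results[0] for r in results):
        return results[0]
    raise RuntimeError("implementations disagree: %r" % results)

# Two 'independently developed' ways to compute an absolute value
impl_a = lambda x: x if x >= 0 else -x
impl_b = abs

print(votes_agree([impl_a, impl_b], -5))  # -> 5
```

With N=2 you can only detect a disagreement, not decide who is right; with N=3 or more you can vote a majority answer, which is the trade-off the literature explores.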
I've been surprised at how much heat and how little light (as in research light) has been applied to this argument. Dan's diversity argument is on pretty solid ground in the research community. As an example, here is a set of papers nicely compiled by the City University of London's Centre for Software Reliability [city.ac.uk] on fault tolerance, and there are quite a few citations on the use of diversity in software. If you don't like the University's papers, you can find similar papers published by the ACM and IEEE. These might help readers decide which point of view is best supported by research. Diversity isn't a slam dunk (lots of nasty details to get right), but it's certainly well-examined ground for high-reliability systems, and a lot of folks are now looking at how to apply these same principles to commercial, off-the-shelf systems.
A final thought: the Internet itself is one of the best examples of such a diverse system. At one point, no RFC was ever approved without two independently-developed implementations of the standard. It's one of the reasons it has worked so well and evolved so well over the last 30 years or so.
Re:The trouble with diversity (Score:4, Insightful)
by rqqrtnb (753156) on Monday February 16, @09:24AM (#8293352)
"Start by installing a stable, easy to use and secure Linux distro." So... in order to be diverse, everyone must use Linux? Apparently your dictionary has a different definition of diverse than mine. "Hackers are about to make it even easier for you to be flattened by a virii attack now that Microsoft source has been leaked to the entire world." Exactly how is "Windows source available on the internet" more dangerous than "Linux source available on the internet"? The problem isn't that Microsoft software has security issues. All the OS's have 'em to some degree. The problem is exactly "monoculture". One bullet kills all. I'm more of a mind that companies need three operating systems. Call them Alpha, Bravo and Charlie to avoid the existing OS arguments. Alpha runs on the corporate web servers, ftp servers and in general anything hooked to the outside world. Bravo runs on the intranet servers that provide file storage, user authentication, etc. Charlie runs on the employee desktops. Thus any virus that targets the public layer (Alpha) won't affect internal operations. Any virus that targets the workstations (Charlie) won't spread to the intranet servers (where important data should be stored, and regularly backed up), and any virus that targets the intranet servers (Bravo) needs to get past the other two (Alpha and Charlie) - or be introduced directly - to be a threat.
Not monoculture, just laziness... (Score:5, Interesting)
by pandrijeczko (588093) on Monday February 16, @09:30AM (#8293390)
Is it just me or do all these pro- and anti-Microsoft "prophets" seem to be missing the point entirely?
The Internet is built on a suite of open protocols that were originally designed for academics and research people to use. Go back 20-odd years and there were no issues of security because only a select few had access to computer networks. Consequently, there was no security built into TCP/IP because there was no need for it.
Now we have a situation whereby if you are a sensible & knowledgeable computer type, whether you use open or closed source software, you can make a pretty good job of securing computers for the Internet - sure, you probably have a reliance on getting the latest patches, putting in a firewall or two, but you can do it. No computer is ever fully secure but you can make it enough of a challenge so that the 99.9% of script kiddies give up trying to crack it and the other 0.1% of knowledgeable crackers probably don't want to waste time with your little box anyway.
Then onto email viruses... Knowledgeable computer users don't suffer from email viruses because they either use email clients that can't execute attachments or they set their machines up so that they know when and when not to run attachments - probably by simply looking at whether or not the sender of the email is to be trusted.
So, in summary, I see this as two core issues, nothing more:
1. Hype and marketing - Microsoft and other software vendors need to step away from the "sales speak" and simply not be allowed to tell Joe Public that PCs are "easy to use" or "secure". It's no different to reminding people to watch their speed and check their tyre treads on a new car, after all... Where are all these "advertising standards" groups that are supposed to ensure adverts convey truth, not lies?
2. User laziness - Joe Public needs to get off his backside and learn how to use the Internet properly and how to secure his PC - again, no different to spending time and money in learning to drive. Far too many people, taken in by the glossy adverts and hype, just sit back and expect software vendors to take all responsibility away from them because they themselves simply cannot be bothered.
What really annoys me about this whole issue is that software (and hardware) companies are only going to react to security issues in their products in a way that makes them more money. If the vendor already has his boxed software on the store shelves, he really has no incentive to employ people to work on further security for his products unless his reputation is so bad that he is forced to improve his software at the risk of losing sales - and you only have to look at Microsoft's currently poor reputation and their actual focus on security to see how far down that reputation must go before any action is taken...
However, on the other hand, DRM can be sold as a security-improving product on the back of peoples' fears of Internet viruses while allowing Microsoft and others to make money licensing DRM.
I wish people like Dan Geer would focus more on the ultimate impact of letting Microsoft "take the blame" only to have Microsoft respond with a technology that will make them more money and cut off our freedoms in the process.
is not monoculture, is evolution. (Score:5, Insightful)
by cabazorro (601004) on Monday February 16, @10:21AM (#8293841)
(Last Journal: Thursday April 24, @11:28AM)
Q: What is the single protocol used by all computers connected to the Internet in the world?
Q:What is the single mail protocol used by all computers connected to the internet?
Q:What is the single protocol used to search the Internet and exchange most information over the Internet?
According to evolution, diversity is the consequence of adaptation.
Specialization, Mutation, Adaptation.
Adaptation is the consequence of a changing environment. A changing environment is the consequence of a finite amount of resources and competition. On the Internet in its current stage, resources are plentiful and competition is scarce. The Internet is currently in the specialization stage. It has not (YET) been forced to depart from its standard protocols (mutate) to survive an attack. Forcing diversity (by mandate rather than natural competition) not only makes the system less robust, it slows down evolution.
Many corporate networks suffered because they are essentially asymmetrical: they check only the incoming mail stream for viruses, not the outgoing one. Internal SMTP servers present a definite risk even if the company uses Lotus Notes. The typical pattern was that first an external gateway stopped responding due to the volume of connections from infected PCs. After the number of infected PCs inside the corporation reached a critical point, the internal SMTP server crashed.
- To: firstname.lastname@example.org
- Subject: MIME::Tools Perl module and virus scanners
- From: "David F. Skoll" <email@example.com>
- Date: Mon, 3 Jun 2002 16:19:58 -0400 (EDT)
Background
----------
MIME::Tools is a very nice Perl module for parsing and constructing MIME-encoded mail messages. The latest stable version is 5.411a. MIME::Tools works very well on valid MIME messages. However, there are a number of problems if you use it to implement server-based mail scanning.

Problems
--------
Problem 1: RFC 2231 encoding not supported.

http://www.ietf.org/rfc/rfc2231.txt specifies (yet another) way to encode filenames in MIME messages. MIME::Tools will not correctly recognize this attachment as "foo.exe":

    Content-Disposition: attachment; filename*1="foo."; filename*2="exe"

Problem 2: Rejection of "obvious" interpretation of malformed MIME.

The following MIME header is valid:

    Content-Type: application/octet-stream; name="bad boy.exe"

But this header is not:

    Content-Type: application/octet-stream; name=bad boy.exe

MIME::Tools interprets the name field as "bad" in this case, and throws away the " boy.exe" part. Unfortunately, most Windoze mail clients make the "obvious" interpretation and recognize the name as "bad boy.exe".

Problem 3: Incorrect concatenation of encoded MIME words.

MIME::Tools does not remove the space from this example:

    (=?ISO-8859-1?Q?a?= =?ISO-8859-1?Q?b?=)

to yield (ab); instead, it yields "(a b)". Some MUA's use encoded MIME words in the Content-Type or Content-Disposition fields. Although this is specifically disallowed by RFC 2047, again, some Windoze mail clients may make the "obvious" interpretation and decode the words.

Summary
-------
Problems 1 and 3 are real deficiencies in MIME::Tools. Problem 2 is not a deficiency in MIME::Tools itself, but that's cold comfort if a virus slips through your server-based scanner.
Patch
-----
A patch which corrects problems 1-3 and does not break any MIME::Tools regression tests is at http://www.roaringpenguin.com/mimedefang/mime-tools-patch.txt

Caveat
------
I make no guarantee that the above patch will catch all forms of malformed MIME which could be interpreted differently by an MUA. In fact, I'm willing to bet there are lots of ways to evade server-based scanners using MIME::Tools or practically any other MIME scanner.

Users of MIMEDefang
-------------------
If you use MIMEDefang (which uses MIME::Tools), you may want to unconditionally call action_rebuild in filter_begin(). This forces the MIME message to be rebuilt by MIME::Tools, resulting in a valid MIME message. This should guarantee that the MUA interprets the message exactly as MIME::Tools did, but it may introduce unacceptable processing overhead.

Vendor Status
-------------
firstname.lastname@example.org contacted 30 May; no response yet.

-- David F. Skoll
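Problem 1 can be cross-checked outside Perl: Python's standard email module does join RFC 2231 continuation parameters, which makes it a handy second opinion for a scanner (note that RFC 2231 numbers the parts from 0 in this sketch):

```python
from email import message_from_string

# A filename split across RFC 2231 continuation parameters; a scanner
# that fails to join the parts never sees the full "foo.exe" name.
raw = (
    'Content-Disposition: attachment; filename*0="foo."; filename*1="exe"\n'
    "\n"
    "fake body\n"
)
msg = message_from_string(raw)
print(msg.get_filename())  # -> foo.exe
```

A server-side filter that compares its own parse against a second, independent parser like this one is less likely to let a disguised attachment slip through.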
If you're using Trend Micro FileScanner, this posting from Frank Hauptle is worth reading.
Freebyte's Guide to Free anti-virus software
Re:A review of a service pack (Score:5, Insightful)
by oogoliegoogolie (635356) on Monday January 05, @09:08PM (#7887086)
If the free anti-virus you're using is AVG [grisoft.com], you're asking for trouble.
I don't know; maybe Grisoft's retail version is good, but about a year ago I downloaded about a dozen viruses just to see how well the free AVG Antivirus version, McAfee, and Norton detect them. Although far from an exhaustive test, AVG missed about a third of the viruses, but McAfee and Norton caught every one.
Free is good, but sometimes you do get what you pay for.
Re:A review of a service pack (Score:4, Interesting)
by J. T. MacLeod (111094) on Monday January 05, @11:13PM (#7887970)
Working for an ISP, I've had the exact opposite experience: AVG would pick up the viruses that the Big Two missed.
In fact, I've so far not found an instance where one slipped by an up to date installation of AVG. The caveat is that it isn't so good at deleting files which need permission changes, nor is it very good at neutering the viruses it's unable to delete.
It's what we recommend to our customers. Then again, we can't recommend anything commercial to our customers, because they'll never install something they have to pay for, no matter how necessary.