Wikipedia:Reference desk/Archives/Computing/2010 December 9

Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


December 9

Web design software

Is there any free software like Adobe Dreamweaver that can easily be used to design web pages? —Preceding unsigned comment added by 202.124.190.218 (talk) 02:05, 9 December 2010 (UTC)

Please see our comparison of HTML editors article. 118.96.154.36 (talk) 02:31, 9 December 2010 (UTC)

KompoZer —Preceding unsigned comment added by 82.44.55.25 (talk) 14:23, 9 December 2010 (UTC)

Free classic science fiction game?

A colleague of mine is teaching a course on science fiction to college undergraduates, and would love to include a video game of some sort in some way. He had originally thought about Deus Ex, but I pointed out to him that though the game had some interesting cyberpunk aspects, it would be a difficult one to use because 1. it takes forever to get to the truly science-fiction aspects (I played it years ago and remember it taking a few hours to reach the stage where you learn about the nanobots, genetic engineering, etc.), and 2. it isn't free and won't run on many modern systems without a lot of hassle. (I seem to recall that installing it was a hassle even on a system of its era.)

I'd been trying to think of something that could be used in its place. Ideally it would be something truly cross-platform and currently free, yet still "classic" enough to be indicative of its time and "deep" enough to have science-fiction themes that could be discussed (e.g. in comparison with Gibson's works, or whatever).

I'd hoped that the source for System Shock had been released, but such is apparently not the case. I seem to recall seeing that Quake 2 (whose source has been released) has been ported to run in a browser window, which seems like a good solution except that Quake 2 is, if I recall correctly, pretty shallow in terms of plot. Combing through Category:Commercial video games with freely available source code didn't turn up anything obvious that fit the bill.

Abandonware is not a permissible option (too much liability to get in trouble with the school administration). It might be possible to set up a paid-for game on a commonly accessible computer so that the students could play it for a few hours. A more boring approach would be to play the game while the students watched, or to resort to YouTube videos.

Maybe some kind of old-school text-parser game, like Zork, but science fiction? I don't know.

Any better suggestions? --140.247.11.242 (talk) 03:13, 9 December 2010 (UTC)

Well, if you're open to using Infocom games (I don't know if you are, since you mentioned Zork but also ruled out abandonware; perhaps you could use The Lost Treasures of Infocom?), you might try A Mind Forever Voyaging. If you want something free, you might browse [1], but such games may be less "classic", as they tend to be newer and/or less well-known. For very old things (as opposed to modern interactive fiction, which can be run under Frotz or Glulx) you can use DOSBox. --NYKevin @205, i.e. 03:55, 9 December 2010 (UTC)
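A minimal illustration of those two routes, assuming a Z-machine story file and a DOS-era executable; the file names here are only placeholders, and both interpreters are free and run on Linux, Mac OS X, and Windows:

frotz story.z5      # play a Z-machine interactive-fiction file under Frotz
dosbox GAME.EXE     # mount and run a DOS-era original inside the DOSBox emulator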
Geez - I have a dim memory of a text-parser game where you had to figure out (first) that you were a disembodied brain controlling an entire complex of some sort, and then (second) how to control all of the functions of the complex before you (in one of a number of different ways) killed everyone in the complex, including yourself. I have no idea what it was called, though - it's from the height of the Zork era (which was before my time, really, but I've always been a bit old-school). --Ludwigs2 04:10, 9 December 2010 (UTC)
Suspended? -- BenRG (talk) 04:57, 9 December 2010 (UTC)
The white chamber is free and pretty good, but it's recent (2005) and more horror than science fiction. In interactive fiction, Adam Cadre's games are uniformly excellent, and Narcolepsy, Photopia, and Shrapnel have science fiction elements. Likewise Emily Short. Some other possibilities from the 1990s are Babel, Delusions, Erehwon, For a Change, Glowgrass, Jigsaw, and Spider and Web. I'm not sure these are quite the sort of thing you're looking for, probably because I'm not entirely sure what science fiction actually is... -- BenRG (talk) 04:57, 9 December 2010 (UTC)
Here's a list of free games ordered by year from MobyGames. I don't know of any way to restrict the list to science fiction. Two games that jumped out at me are The Hitchhiker's Guide to the Galaxy (computer game) (most Infocom games aren't free, but this one apparently is) and Beneath a Steel Sky (works on modern computers via ScummVM). -- BenRG (talk) 05:34, 9 December 2010 (UTC)
Re H2G2: [2]. --NYKevin @847, i.e. 19:19, 9 December 2010 (UTC)
I see that there's an open source remake of UFO: Enemy Unknown here: [3]. As a frequent visitor to the interactive fiction archives, I'm in a position to say that text adventures are all awfully dull and you should avoid them. 213.122.59.245 (talk) 06:04, 9 December 2010 (UTC)
Personally, I'm much more bored by strategy games than text adventures. It is, unfortunately, probably true that any game will bore some fraction of the students. -- BenRG (talk) 22:21, 9 December 2010 (UTC)
To paraphrase what somebody once said about golf, an adventure game is a good story, ruined. 81.131.43.12 (talk) 23:57, 9 December 2010 (UTC)
For a pure sci-fi plot with aliens and star systems and artifacts and progenitors of our species, etc., I absolutely loved the Star Control series... but check out Star Control 3. You should be able to get it for cheap, because it was released in 1996. The series was fun to play, and contained a great deal of sci-fi text. It also did not take long to get into the nitty-gritty of the game. In terms of pure cyberpunk I don't think there's too much (Category:Cyberpunk_video_games), but there's a new Tron (2010) available... not sure about the price, though. Sandman30s (talk) 12:57, 9 December 2010 (UTC)
Star Control II is free software under the name The Ur-Quan Masters, which is community-maintained and cross-platform. I think the changes from the original version are slight, and mainly about usability or cosmetics or something, but I haven't played it much, and I never played the original. (Interestingly, Adam Cadre, mentioned above, considers SC II to be "perfection".) Paul (Stansifer) 16:23, 9 December 2010 (UTC)

IT jobs in 2013

What will the job opportunities in the IT field be in the year 2013? — Preceding unsigned comment added by Josephite.m (talk · contribs) 04:14, 9 December 2010 (UTC)

WP:NOTCRYSTAL --LarryMac | Talk 13:15, 9 December 2010 (UTC)
If you use your favorite search engine to search for "long term job forecast", or similar search terms, you will get a hodgepodge of extended forecasts, often for individual states. In Florida, for example, it's predicted that "job gains in new information technology sectors are muted by job losses in the more established information technology sectors of this industry" (this is the forecast through 2017). Moody's will sell you very detailed job histories and predictions for different states. Looking at their free sample (for New York, from 2003, so take the data with a huge grain of salt), it looks like the number of IT jobs (under "Internet Service Providers, Web Search Portals, & Data Processing Services") will continue to grow, though fairly slowly. If you want up-to-date predictions, you can buy a single report for 235 USD, or a yearly subscription (with 12 issues) for 2175 USD. I would say that in general, information technology is a fairly safe field to go into; there are likely to be jobs available for a long time, and they tend to pay pretty well. Buddy431 (talk) 21:54, 9 December 2010 (UTC)

Firefox remote functionality

I am curious about what is actually happening in the following scenario. I assume that this functionality is called "remote" because it is turned off with the option "-no-remote". The steps to reproduce it are:

  1. Log in to a Linux computer (call this computer "local").
  2. Run Firefox on the local computer.
  3. SSH with X tunnelling (ssh -X) to a remote Linux machine (call this computer "remote").
  4. Run Firefox on the remote computer.

What I expect to see is two Firefox windows on the local machine: one running on the local machine and one running on the remote machine. What I actually see is two windows on the local machine, both running on the local machine. Instead of launching Firefox on the remote machine, it simply tells the Firefox on the local machine to spawn a new window. So how does that happen? (Please be very technical - I'm looking for the exact inter-process communication used between the remote and local computers that lets the remote computer know I'm running Firefox on the local machine without asking permission to monitor processes on the local computer.) -- kainaw 14:17, 9 December 2010 (UTC)

As a guess, Firefox uses Inter-Client Communication (ICCCM), which is part of the X Window System protocol. On start-up, Firefox will attempt to register itself (unless -no-remote is used); if this fails, an instance of Firefox is already running. On failure, the second Firefox uses ICCCM to send a message to the first Firefox with the URL(s) it was asked to open, and then exits. The first Firefox then opens those URL(s). As long as both Firefoxes are being shown on the same display, they will interact. Neither needs to be running on the same machine as the other, or on the display's machine. CS Miller (talk) 14:28, 9 December 2010 (UTC)
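Incidentally, Firefox builds of this era expose the same single-instance handoff on the command line via the -remote option (documented in the old Mozilla X remote protocol notes; exact command syntax can vary by version). For example, with a Firefox already showing on the current display:

firefox -remote "ping()"                                # check whether a Firefox is listening on this display
firefox -remote "openURL(http://example.com,new-tab)"   # hand the URL to the running instance and exit

No second long-lived process is started; the command is delivered to the existing instance over the display connection, as the rest of this thread works out.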
It may do it with atoms rather than ICCCM. A brief look at Firefox's X protocol traffic (with xtrace) shows it interning some atoms like this:

000:<:0007: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_VERSION'
000:<:0008: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_LOCK'
000:<:0009: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_COMMAND'
000:<:000a: 28: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_RESPONSE'
000:<:000b: 16: Request(16): InternAtom only-if-exists=false(0x00) name='WM_STATE'
000:<:000c: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_USER'
000:<:000d: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_PROFILE'
000:<:000e: 24: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_PROGRAM'
000:<:000f: 28: Request(16): InternAtom only-if-exists=false(0x00) name='_MOZILLA_COMMANDLINE'
-- Finlay McWalterTalk 14:48, 9 December 2010 (UTC)[reply]
I didn't think ICCCM could work, because it only allows communication between clients of the same X server. I am running these on two different X servers. Atoms, on the other hand, are held by the displaying X server. If I run Firefox on two different remote computers without -no-remote, I see one set of _MOZILLA atoms in xlsatoms. If I run them with -no-remote, I see two sets of _MOZILLA atoms. So it appears that they are sort of abusing atoms as a means of communicating with one another. -- kainaw 16:18, 9 December 2010 (UTC)
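For anyone wanting to reproduce that observation, xlsatoms (one of the standard X utilities) lists the atoms interned on a given X server:

xlsatoms | grep MOZILLA     # show the _MOZILLA_* atoms interned on the current DISPLAY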
How are you running them on two separate X servers? Your "SSH with X-tunnel" means you're reusing the local one. --Tardis (talk) 16:52, 9 December 2010 (UTC)
Sorry for not being detailed about the second experiment I did. I used ssh -X to log in to one machine. The other I didn't ssh into at all; I just exported the display from the computer itself to make it show up on another machine. I was looking for a difference. What I found is that Firefox sees and communicates with every other Firefox session on the computer on which it is displayed. -- kainaw 16:55, 9 December 2010 (UTC)
"The computer on which it is displayed" is the X server, not the one on which the executable is running. --Sean 19:46, 9 December 2010 (UTC)[reply]
It may do it with atoms rather than ICCCM. — This is meaningless. The ICCCM is a set of standards about how X clients communicate, which they do (amongst other things) by means of atoms. Marnanel (talk) 22:46, 9 December 2010 (UTC)
I'll note the following: when I start Firefox through an SSH session, it's typically because I want to appear to be using the remote IP address. (For example, if I'm at home and I log in to a university computer, I have access to the subscription journal websites.) But if Firefox activates this remote mode, there is no proxying - I do not have access to the subscription sites. From this, I gather that Firefox is not running anything on the sshd server side. When I log in to the SSH server and start Firefox, I assume that ssh sends a command back to my local computer to run Firefox entirely locally (or open a new tab in a currently-running local Firefox). It seems that the mechanism for this is that the firefox command actually runs a shell script - which delegates to another script, run-mozilla.sh. It is this script that executes a local command. (I'm not sure whether this is a capability of SSH: can you force a command to be run on the client machine?) I am still looking up what exact procedure that involves - but I suspect it's a matter of sending the correct escape key(s) to SSH. (Possibly ^Z.) Nimur (talk) 21:06, 9 December 2010 (UTC)
Though, this post indicates that Firefox actually does launch remotely, and then hacks up the X11 protocol to send an inter-process communication over to a Firefox running on the client-side. I can't confirm that behavior independently, but it seems more plausible than SSH itself allowing code to execute on the local machine. The question now becomes, "can X11 merely send a message to a firefox process already running on my local computer, or can X11 actually cause execution of arbitrary code on my local computer?" Nimur (talk) 21:20, 9 December 2010 (UTC)
The relevant communication is done by XRemoteClient::DoSendCommand in XRemoteClient.cpp, which uses XChangeProperty to manipulate MOZILLA_COMMAND_PROP, one of the atoms I mentioned above (it looks like they use MOZILLA_RESPONSE_PROP for the response). I don't know how durable this protocol would be if there were lots of instances banging out requests and responses, but for the occasional use it actually gets, this seems fair enough. -- Finlay McWalterTalk 21:49, 9 December 2010 (UTC)
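For the curious, here is a stripped-down Xlib sketch of what that send side amounts to. It is a toy approximation of XRemoteClient::DoSendCommand, not the real thing: the _MOZILLA_LOCK handshake and all error handling are omitted, and it naively walks the whole window tree looking for a window carrying _MOZILLA_VERSION (the atom names are the ones in the xtrace dump above). Build with cc xremote.c -lX11.

#include <stdio.h>
#include <string.h>
#include <X11/Xlib.h>
#include <X11/Xatom.h>

/* Depth-first search for a window that has _MOZILLA_VERSION set on it. */
static Window find_mozilla_window(Display *dpy, Window w, Atom version)
{
    Atom type = None; int fmt; unsigned long n, left;
    unsigned char *data = NULL;
    Window root, parent, *kids = NULL, found = None;
    unsigned int nkids, i;

    if (XGetWindowProperty(dpy, w, version, 0, 1, False, XA_STRING,
                           &type, &fmt, &n, &left, &data) == Success
            && type != None) {
        XFree(data);
        return w;
    }
    if (!XQueryTree(dpy, w, &root, &parent, &kids, &nkids))
        return None;
    for (i = 0; i < nkids && found == None; i++)
        found = find_mozilla_window(dpy, kids[i], version);
    if (kids)
        XFree(kids);
    return found;
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);  /* the display both firefoxes share */
    if (!dpy)
        return 1;
    Atom version  = XInternAtom(dpy, "_MOZILLA_VERSION",  False);
    Atom command  = XInternAtom(dpy, "_MOZILLA_COMMAND",  False);
    Atom response = XInternAtom(dpy, "_MOZILLA_RESPONSE", False);

    Window w = find_mozilla_window(dpy, DefaultRootWindow(dpy), version);
    if (w == None) {
        fprintf(stderr, "no running firefox found on this display\n");
        return 1;
    }

    /* Writing the command property is the "message"; the running firefox
       watches its own window for property changes and reacts. */
    const char *cmd = "openURL(http://example.com,new-tab)";
    XSelectInput(dpy, w, PropertyChangeMask);
    XChangeProperty(dpy, w, command, XA_STRING, 8, PropModeReplace,
                    (const unsigned char *)cmd, (int)strlen(cmd));

    /* Block until _MOZILLA_RESPONSE is updated, then we're done. */
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == PropertyNotify && ev.xproperty.atom == response
                && ev.xproperty.state == PropertyNewValue)
            break;
    }
    XCloseDisplay(dpy);
    return 0;
}

The point is that everything travels through window properties on the shared X server, which is why it works across machines: the only thing the two processes need in common is the display.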
But that still doesn't explain how a Firefox process can be started on the local machine, if it was not already running. Can X11 initiate remote code execution? Nimur (talk) 22:21, 9 December 2010 (UTC)
Nobody has said that a Firefox process will be started on the local machine if one isn't already running. X will not kick off a new process just because you changed some root window properties. Marnanel (talk) 22:38, 9 December 2010 (UTC)
Right, that's my understanding, and how it behaves for me when I try it out. Suppose I have two machines, local and remote, where I'm sitting at the X display on local and there's nothing but an sshd running on remote. If, in a terminal window on local, I write ssh -X remote firefox http://foo.com, what happens depends on whether I have a Firefox instance already running on local. If there is a local Firefox, the ssh command runs a remote Firefox, which does the X atom thing above (which it can, because of the X forwarding), sends the message, and closes, and http://foo.com opens (in a new tab) in the local Firefox instance. If there isn't a Firefox running on local, that atom procedure fails, so the Firefox on remote runs and creates its own X window (which ssh forwards back to local). In the former case the remote Firefox only runs for a fraction of a second before terminating (and exiting the ssh connection), whereas in the latter case the remote Firefox process persists as long as I use it (or the ssh connection survives). I don't see any remote execution beyond what you'd expect ssh to do. -- Finlay McWalterTalk 22:47, 9 December 2010 (UTC)

If all 13 root nameservers were simultaneously down

Would every attempt by a computer user using a browser to go to www.example.com (where you replace 'example' with a real website) result in a "server not found" error? Would it be crippling, or is the caching done by local nameservers (which know the nameservers that know where a given website is) statistically good enough that a lot of people would still be OK? 20.137.18.50 (talk) 18:06, 9 December 2010 (UTC)

It depends on your ISP's name-server caching policy. Most of them probably update their cache every few hours, and some of them might use wacky "smart" caching (i.e. if they can't get a connection to a canonical server, they might drop the record, or they might keep the old version). Since DNS servers are just software implementations, the possibilities are pretty much infinite in terms of what they could do. But most probably, they would keep their records (because there's a critical difference between "can't connect to a canonical name-server to look up example.com" and "canonical nameserver no longer has a record for example.com"). Depending on the size of their cache, some users would never notice anything; but the "long tail effect" seems worth mentioning, and many users would immediately see DNS resolution problems (by trying to visit sites that are not in the local cache). And it would be a pretty big problem when IP/DNS-name mappings started changing. You can read about BIND, a very common DNS server software. In fact, an entire book exists, DNS and BIND, published by O'Reilly, containing several chapters about the caching algorithm(s) it uses. In particular, note that the net-ops engineers can configure the master/slave relationship and the DNS zones in any way they see fit; presumably, they might hand-tweak this sort of stuff, or they may run analytics and auto-generate configurations. If a specific ISP sets up a master server with a huge database, they are almost entirely independent of the root nameservers (except that they will never learn about new, deleted, and changed IP/DNS records). It is probable that your ISP uses BIND for its DNS server, but the only way to be sure is to ask someone in your ISP's network engineering/operations group. Nimur (talk) 19:31, 9 December 2010 (UTC)
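A quick way to see the cache/root split in practice is dig, which ships with BIND. A plain query is answered by your configured (usually caching) resolver, while +trace ignores the cache and walks down from the root servers; it is the second form that would fail outright if every root were unreachable. (The resolver address below is just a documentation placeholder.)

dig www.example.com                           # answered by your ISP's caching resolver
dig +trace www.example.com                    # starts at the root nameservers and works down
dig +norecurse @203.0.113.1 www.example.com   # probe a specific resolver's cache only

A cached record keeps answering until its TTL expires, which is why a short root outage would go unnoticed by many users.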
I have to add that there are a hell of a lot more than 13 root nameservers. There can't be more than 13 addresses for them, but anycast means that multiple servers can share an address. In fact there are currently 243 root nameservers. All of them going down at once is rather unlikely. Marnanel (talk) 22:35, 9 December 2010 (UTC)
Honestly, "all 243 servers" going down at once is about as likely as "all 13 servers" going down at once (and not because the probability is 0). 82.234.207.120 (talk) 07:38, 10 December 2010 (UTC)

Samsung TVs with DLNA are being fussy with h264 media

I've tried a couple of different 2010 Samsung TVs with their "allnet" DLNA media playback feature. Every single h264 .mp4/.m4v refuses to play on them. I get a "Not supported file format" error. These files, shared over DLNA, play just fine on a PS3 and a WDTV box. And the TV manuals clearly state h264 is supported up to 1080p and 10 Mbit/s (which these media files are within). Is there something I can do on my end to get these movies into whatever strict specification the Samsung TVs are expecting? --70.167.58.6 (talk) 20:55, 9 December 2010 (UTC)

Find out the limitations of the Samsung TV's H264 support (High Profile level 4.1, perhaps? I believe the PS3 supports level 5) and in future make sure your H264 encodes are restricted to what your TV supports. If these are commercial encodes, look for encodes that are likely to be compatible, or ask the vendor if they can release versions for your TV. Samsungs are popular TVs, so it may be possible. You can of course transcode, but it seems a bit of a waste of effort when you can just encode it properly or get better versions from your vendor. According to [4] the limits are well discussed, so they shouldn't be hard to find, although if the manual doesn't list any more details, asking Samsung may be a better bet. [5] suggests you want to stay under 8 reference frames. Nil Einne (talk) 15:24, 12 December 2010 (UTC)
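If transcoding does turn out to be the only option, a one-liner along these lines would pin the encode to a conservative profile, level, and reference-frame count. This is a sketch assuming a reasonably recent ffmpeg built with libx264; option names vary between ffmpeg versions, and the input/output names are placeholders:

# Re-encode video as H.264 High@L4.1 with 4 reference frames; pass the audio through untouched
ffmpeg -i input.mkv -c:v libx264 -profile:v high -level 4.1 -refs 4 -c:a copy output.mp4

Four reference frames stays comfortably under the 8-reference-frame limit mentioned in [5].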

Declarations and definitions

I'm fairly confident that I know what these are (having read The C Programming Language carefully), and that the article Declaration (computer science) doesn't, quite. I left a comment to this effect on its talk page back in April, but nobody reacted. I've just discovered, to my disgust, that Definition (disambiguation) links to the Declaration article, and calls a definition "A statement declaring ...". So am I being too much of a K&R fanboy, and failing to get with the modern world in which declarations and definitions are the same thing, or is this all a mess? 81.131.43.12 (talk) 23:53, 9 December 2010 (UTC)

If it says that, I'd declare the definition wrong... I'll take a look, and see if I can make any sense of it. AndyTheGrump (talk) 00:10, 10 December 2010 (UTC)
Wait, actually, I think a couple of people have improved the article in recent months (without saying anything on the talk page) and possibly fixed it. I wonder what I should do with the disambiguation, though. A definition blatantly isn't a declaration, yet "declaration" is the right article to link to. Maybe I'll just rephrase the disambig page a bit.
...Done. (Ugh, that "and/or" is a bit ugly.) Thank you for your moral support. 81.131.43.12 (talk) 00:14, 10 December 2010 (UTC)
Yeah, the Declaration (computer science) article looks about right, though it could do with clarifying the distinction between definition and initialization for variables: they aren't necessarily the same thing in some languages. The disambiguation still needs a little work; it needs to refer to functions as well as variables.
I wonder if the article should be called Declaration and definition (computer science)? I know Wikipedia practice is to avoid 'and' in article titles, but the two concepts are so closely interlinked it might make more sense that way. We'd need disambiguations for both 'declaration' and 'definition' though. Any thoughts? AndyTheGrump (talk) 00:31, 10 December 2010 (UTC)
Well, it might be that declaration is the broader concept. The "variables" section of the article talks about languages which allow (or require) implicit declarations. This is a common phrase; so far as I know, "implicit definitions" isn't, even though, technically, this is what's being talked about in that section of the article as well. All the examples in that section are testing to see if x and y can be implicitly defined as well as implicitly declared; and most of the error messages say "x not defined", but the concept is still called "implicit declaration". Searching for that takes me to the page Undefined variable, where I read "An undefined variable in the source code of a computer program is a variable that is accessed in the code but has not been previously declared by that code." Argh! So maybe I am being overly fussy about a fine distinction. 81.131.43.12 (talk) 01:03, 10 December 2010 (UTC)
I think you are right about the distinction being real enough (in most languages), but part of the problem is that different terminology is used for the same things in different languages. Fortunately, I doubt many people try to learn computer programming from Wikipedia articles (at least, I hope not). AndyTheGrump (talk)
Every definition is a declaration; see WG14/N1124 section 6.7. I edited Definition (disambiguation) to agree more or less with that definition of "definition". -- BenRG (talk) 04:55, 10 December 2010 (UTC)
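For concreteness, here is the C version of the distinction being discussed, in the K&R/C99 sense that every definition is also a declaration but not the other way round:

/* decl.c - declarations vs. definitions at file scope */
extern int counter;        /* declaration only: names the object, reserves no storage  */
int counter = 0;           /* definition: also a declaration, and reserves the storage */

double square(double x);   /* function declaration (a prototype)       */
double square(double x)    /* function definition: body included, also */
{                          /* declares the function                    */
    return x * x;
}

Note that a definition need not initialize (at file scope, int counter; alone is a tentative definition), which is the definition/initialization distinction AndyTheGrump mentions above: other languages separate the two more sharply.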

Spyware/virus protection for Mac

I had always heard in the past that Macs were basically immune to viruses. I have no idea if this is true, or if it was true but is not any longer. My brand new iMac should be arriving tomorrowish and I have a few questions. Does it need any virus and/or spyware protection? If so, what do you recommend? (I'd prefer free, though I am willing to pay if it's really the better route.) On that front, I will be transferring all manner of files from my soon-to-be-defunct PC over to the Mac. I would not be at all surprised if this computer is infected (despite having Norton running continuously for years), as I have a lot of problems that I think may be the result of a virus. But I just can't abandon my files; hundreds of pictures, videos, Word docs, PDFs, and so on. If you tell me I need antivirus, will I immunize myself (for the most part) if I first install that software and only then do the file transfer - will it scan the files as they're being transferred and block the transfer of infected ones? If you tell me I don't need any antivirus software, how can I protect myself during the transfer, or do I need to at all? Thanks in advance. --162.84.137.228 (talk) 23:57, 9 December 2010 (UTC)

It is not so much that they're "immune" to viruses as that they're "irrelevant" to virus writers. (It also helps that users aren't logged in as administrators by default, that the system runs a pretty rigorous Unix back-end, and that it lacks the standard Windows vectors like MS Outlook or MS Office or Internet Explorer. MS Office exists on the Mac, but its scripting is heavily restricted, and the most recent Mac versions have eliminated scripting altogether. Which is actually irritating, but whatever; in this context it is a positive.) Anyway, I have been using a Mac for 6 years now without any virus or malware protection. (And I monitor things pretty well and would know if something was on there that shouldn't be.) So I really wouldn't worry about it. There is just no real threat. As for the PC viruses, they won't run on a Mac, so it doesn't really matter from the standpoint of protecting the Mac. --Mr.98 (talk) 00:00, 10 December 2010 (UTC)
Apple's official Mac OS X security page explains the built-in protections. The protection is almost identical to the protection that Windows offers: the operating system will ask you for permission to run any unfamiliar program. If you choose to run a program, and it turns out to have malicious intent, it doesn't matter what operating system you have. Here's a pretty decent listing of malware known to run on Mac computers. Especially take note of malware that attacks a browser or browser plugin - in this case, the malware is operating-system and hardware-agnostic. Nimur (talk) 00:07, 10 December 2010 (UTC)
That page notwithstanding - I know lots and lots of people who use Macs who I would call non-techy. None of them, to my knowledge, have ever had any trouble with malware. All of the people I know with PCs who are non-techy have endless malware issues. I just don't think there's enough of a community of Macs and Mac viruses to sustain serious infections to the degree seen on Windows machines. That could change, of course. But I don't think it's worth the performance hit of antivirus, personally, in terms of a risk tradeoff. --Mr.98 (talk) 01:48, 10 December 2010 (UTC)
Okay, so I guess I won't worry about it too much. Thanks for the tips. --162.84.137.228 (talk) 01:49, 10 December 2010 (UTC)
One more question. I read the security information page Nimur linked. It seems some of the protection is geared towards using Safari. I am partial to Firefox. Am I asking for trouble by staying with it? Regarding "risk tradeoff": I have always thought of my antispyware and antivirus programs as almost viruses in and of themselves. A necessary evil, but quite, quite evil; a constant system-messing, crash-inducing, resource-robbing plague. --162.84.137.228 (talk) 01:55, 10 December 2010 (UTC)
I don't use antivirus or antispyware software on Windows and have never had any problem. I think that the people who do get malware get it mainly from warez, stupid dancing frog animations, or visiting dodgy websites with an unpatched browser. I only download software that doesn't suck and I keep my network apps updated, and that strategy seems to work. I also use Firefox with NoScript, but I don't know if it's ever protected me from anything. -- BenRG (talk) 04:37, 10 December 2010 (UTC)
I use Windows and always look out for suspicious behaviour on my laptop, and for unexpected files appearing, and, every time I've investigated, the culprit has been security software doing unauthorised and unnotified background updates! Sometimes its behaviour seems to be as bad as that of the viruses and other malware that it protects against, and it has never found anything harmful in many years of use, but I'm not quite as careful as BenRG, so I keep the annoying software just in case I make a mistake. Dbfirs 10:13, 12 December 2010 (UTC)
I do not use a Mac, but I think that the threat of viruses on the Mac is clearly lower than on Windows at the moment. However, if you plan to transfer files you suspect of being infected, I would definitely scan them; you do not want to infect your friends' computers. To be reliable, a scan should be done on a computer that is known to be uninfected. You could copy all the files to the new computer and then run a virus scan on them. To do that you do not need to install a resident anti-virus program. I do not know which virus scanners to recommend for the Mac. --Gr8xoz (talk) 11:13, 12 December 2010 (UTC)
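One free possibility for that kind of one-off scan is ClamAV, which runs on Mac OS X (ClamXav is a common front-end). Assuming its command-line tools are installed, and with the folder name here just a placeholder:

freshclam                      # update the signature database first
clamscan -r -i ~/Transferred   # scan recursively, printing only infected files

Since this only reads the files, it fits the "copy first, then scan, without installing a resident scanner" approach suggested above.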