
Wikipedia:Reference desk/Computing

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 89.242.99.245 (talk) at 19:54, 30 November 2009 (Is it good programming practice to use state variables?). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Welcome to the computing section of the Wikipedia reference desk.
Want a faster answer?

Main page: Help searching Wikipedia

   

How can I get my question answered?

  • Select the section of the desk that best fits the general topic of your question (see the navigation column to the right).
  • Post your question to only one section, providing a short header that gives the topic of your question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Don't post personal contact information – it will be removed. Any answers will be provided here.
  • Please be as specific as possible, and include all relevant context – the usefulness of answers may depend on the context.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.



How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.


November 25

Google Books results

What accounts for the variability in Google Books results? I was able to find two sources the other day that now will simply not come up, no matter what I do. They are fairly old books, so I cannot figure out what I'm doing wrong. Viriditas (talk) 10:13, 25 November 2009 (UTC)[reply]

Google's search algorithms are proprietary - they are not explained, and they can change any time Google wants. The variability could stem from anything from technical issues (e.g. a database server temporarily unavailable) or algorithm changes (a new server software version doesn't recognize your keywords) to malicious censorship (Google noticed a search trend with that search result, and decided to censor it in order to promote commercial competitors). In fairness, it is also plausible that you aren't searching with the same exact query (or you were logged in last time, etc.) Google is known for location-based, login-based, and other context-sensitive results. Because Google's search algorithm(s) are proprietary, there isn't any way to know for certain which is the case. Nimur (talk) 13:41, 25 November 2009 (UTC)[reply]
It's the same query and I wasn't logged in, but my location did change, approximately three times. And I did get three different responses for a few days. However, after repeatedly searching and receiving the same results for almost a week, the two books disappeared from view and can no longer be retrieved, as if they never existed. I have managed to track one down in my area, so that's a hit in RL, but the other one is gone. I've been using Google Books for a while, and this has never happened before, but I did notice that the one that disappeared is a foreign-language edition. Viriditas (talk) 04:10, 26 November 2009 (UTC)[reply]

Why did my CD/DVD Drive Disappear from My Computer and Device Manager?

Please forgive me if this has already been addressed elsewhere. If so, I'd be grateful for the link. I searched and didn't see anything resembling this, though.
I have an HP Pavilion DV6646US laptop computer -- 160 GB HD, 2 GB RAM, running Vista (64). I have owned it for around two years. I still have a little over a year left on an extended warranty. (Phew.) Here's the story:
On three different occasions in these past two years, my CD/DVD optical drive (E:, on my system) has simply disappeared from My Computer and Device Manager. Nowhere to be found. The first time it happened, a little over a year ago, I presented my problem to the extended warranty people, and they had me send it in for service. I think they replaced the motherboard and the hard drive. They returned it in almost brand-new condition. I reloaded all my programs and data and everything seemed to be OK. Then, about four months ago, it happened again. I called the extended warranty people again, and the representative was convinced that it was a driver conflict with a Windows Update, and basically chewed me out (!) for installing all suggested updates (setting my computer to "automatically update"). He recommended either doing system restores, stepping backwards, until my CD/DVD drive reappeared, or uninstalling Microsoft Updates individually until I found the "culprit," or if neither of those worked, doing a complete system restore. I tried what he suggested, and it didn't work, so I did a system restore from a partitioned "D: Drive" used exclusively for recoveries. To my relief, the CD/DVD drive was back, and working-- but a week or so later, it disappeared again. I called the warranty people back, and they had me send it in again. When I got it back, the invoice said that the motherboard and the hard drive were replaced (again). It was smooth sailing, doing automatic updates from Microsoft and HP, until a few days ago when I noticed that once again, the CD/DVD drive was missing.
I would certainly prefer not to send it in a third time if I don't have to. It's not a question of cost, as it's covered under warranty-- I'm just not convinced that it's a permanent fix, AND, it's a round trip of about 2-3 weeks. I checked the HP website for a fix, and they seem to mention it here, but neither clicking on the "fix it" link nor following the provided instructions will fix the problem (in my case). This is because it's not even FOUND in the device manager, and choosing "scan for hardware changes" doesn't find the missing drive, either. I would say that it's a dead or dying CD/DVD drive, and I could support that theory, but it IS getting power-- the light blinks when I'm booting up, and it opens/closes OK. (I realize that there's more to it than that....)
So here's my question: Has anyone else out there had this same problem, with a different result? And what did you do? Or what should I do? Could it be something with the BIOS? I'll check it when I get home from work tonight, and update accordingly (but I'm not too confident in that....) Sorry this is so long. Thanks a million-- Kingsfold (talk) 14:43, 25 November 2009 (UTC)[reply]

Have you tried the steps here? Looks like the likely source is a conflict with bad DVD burning software. The KB includes instructions to fix the problem. —ShadowRanger (talk|stalk) 14:59, 25 November 2009 (UTC)[reply]
I tried the "fix it" link there a few days ago when I first noticed it, but I didn't try to fix the registry myself. (Not afraid to, per se, I just figured that the "fix it" thing would do the exact same thing.) I will check my registry when I get home tonight and see if it does any good. Thanks for the link. Any other suggestions? Kingsfold (talk) 15:35, 25 November 2009 (UTC)[reply]
Well, if the software fixes found online don't work, you might take a stab at replacing the DVD drive and related cables. A new DVD drive runs about $20 last I checked. The tech support morons apparently made no attempt to replace your DVD drive, they just replaced the motherboard and hard drive. Try getting a new DVD drive (from a different manufacturer just to be certain) and installing it (borrow a neighborhood geek to help if needed). This way, if the problem is either the specific hardware, or the manufacturer's drivers, you've replaced both, and should have a decent test bed for comparison. —ShadowRanger (talk|stalk) 15:44, 25 November 2009 (UTC)[reply]
Could it be a wobbly cable/connector? Most of the time it's connected, but it gradually works itself loose with temperature changes and transporting the laptop. Sending it away for a new motherboard fixes it for a while because they have to reseat the connector when putting your laptop back together. I've no idea what you would do about it if that was the problem ... ask HP for a new DVD drive, perhaps? Astronaut (talk) 19:10, 25 November 2009 (UTC)[reply]
It's connected, at least to the power, because it blinks and opens/shuts. But yes-- perhaps the connector itself has worked itself loose. Hmmm. Interesting thought. Kingsfold (talk) 22:19, 25 November 2009 (UTC)[reply]
Oh, damn. Didn't notice it was a laptop. With frequent movement, a bad connector, or even a bad case (a little warped, so the drive isn't fixed in position properly) could easily be the problem. And it also means a replacement drive will cost a bit more than $20 (and may not be possible if the computer isn't built for swapping). —ShadowRanger (talk|stalk)
Right. And thankfully, it's under warranty still, so I'd rather not get too deep into the nuts and bolts. Literally. Good thought, though. Kingsfold (talk) 22:19, 25 November 2009 (UTC)[reply]

I had a similar problem at a customer's house a while ago. I fixed it by doing a system restore (Start --> All Programs --> Accessories --> System Tools --> System Restore). You should also look in the Device Manager to see if the drive has an exclamation mark next to it (right-click on My Computer and choose Manage). If it does, then try uninstalling the device from there and rebooting. Windows will automatically re-install it.--Drknkn (talk) 19:38, 25 November 2009 (UTC)[reply]

Appreciate the idea and the input, but the system restore has been tried, with no luck. The drive isn't showing up at all, exclamation mark or not, in Device Manager.  :-(
I mentioned to the warranty guy that next time I buy a laptop, it might be one of those mini ones, and I may just get a USB CD/DVD drive along with it. He said, "Yeah, you could do that." Kingsfold (talk) 22:19, 25 November 2009 (UTC)[reply]
Update: I did a complete system recovery (reformat drive, reinstall everything) and it's still not appearing. Guess it's about time to call the extended warranty people again. (!) Kingsfold (talk) 14:00, 14 December 2009 (UTC)[reply]

Illegal content uploaded to website

I'm looking for links to relevant articles for the following scenario. Note that I'm not asking for legal advice, only links to articles.

Hypothetically, someone in the UK hosted a small website on their home computer which allowed anyone in the world to post text and images, like a forum. An illegal image was uploaded to that site, and the site owner deleted the offending image as soon as he / she noticed it, and banned the ip address of the poster. What consequences could the host face? Would they be jailed, fined etc? —Preceding unsigned comment added by 82.44.55.114 (talk) 16:02, 25 November 2009 (UTC)[reply]

A similar question was asked on November 18. As before, it's virtually impossible to answer your question without giving legal advice. But, as before, you might be interested in the following academic review of prior cases: Liability of System Administrators. This paper points to a lot of related articles of both academic and legal interest. Nimur (talk) 17:23, 25 November 2009 (UTC)[reply]
You might find The Pirate Bay trial of relevance.
Are you talking about copyrighted material or illegal material such as child porn? Most sites I've encountered say in their terms & conditions that copyrighted material will be removed on sight or on request from the copyright holder; and uploading of illegal material will be reported to the police. As to how binding the T&Cs are or whether a website owner can use their published T&Cs as protection from the law, you will need to ask a specialised lawyer. Astronaut (talk) 18:59, 25 November 2009 (UTC)[reply]

A shame that you're living in the UK; in the US, system administrators are explicitly not liable per law for the content posted by internet users if they make a good faith attempt to avoid the posting of the content and remove it on sight. Too bad I can't remember the name of the congressional act. Magog the Ogre (talk) 18:28, 27 November 2009 (UTC)[reply]

Best computer?

Short list of laptops from Newegg

Hello, I am in need of a new laptop computer and I was wondering if you could help me out. I will give you guys a short list of laptops and then if you have the time could you suggest which computer is the best bang for the buck out of the provided list? Thanks, I REALLY appreciate it!! --96.230.224.100 (talk) 16:48, 25 November 2009 (UTC)[reply]

The 2.8 GHz 15" MacBook Pro with 8GB main memory, 256GB SSD and antiglare screen. The extra 300,- for the 3.06 GHz processor are not worth it. ;-) --Stephan Schulz (talk) 16:58, 25 November 2009 (UTC)[reply]
A more serious reply is that it depends greatly on what you are using it for. If you plan to play games or do serious video editing, you'll want dedicated graphics (which rules out all the Intel GMA machines). Every one of the machines listed has 4GB of RAM; more than that is unnecessary for 99.99% of tasks, so you're fine on that score. CPU speed rarely matters anymore, and every one of them is at least dual-core (the extra core is handy for when a process goes haywire, but more than that isn't usually needed). Hard disk space is up to personal need; I've got 1.7 TB on my home machine, but most people don't even use a tenth of that. —ShadowRanger (talk|stalk) 17:03, 25 November 2009 (UTC)[reply]
If I were buying a new laptop, this is what I'd buy: [1]. I'd choose the RAID-0 option with a quad-core CPU.--Drknkn (talk) 18:17, 25 November 2009 (UTC)[reply]
If you intend on carrying it anywhere, get something small and light. You will soon get sick of carrying a 15lb, 17" monster around. You can always get a large desktop monitor and external keyboard for use at your desk. And check how big the power pack is - it's no good having a 4lb laptop with a 6lb power brick! If you are using it at school and don't have access to a power supply, you might need to consider battery life. For gaming, forget it and get a desktop machine optimised for that purpose. Astronaut (talk) 18:33, 25 November 2009 (UTC)[reply]
Agreed on weight and battery life. If this is a desktop-replacement laptop it doesn't matter, but if you're toting it everywhere, every ounce counts. Ignore the people lusting after over-powerful machines. Alienware and Mac Pro laptops are *massive* overkill for anyone not using them for gaming, video editing or similar CPU/GPU intensive tasks. Heck, for most people, even the laptops you indicate are excessive. A light, quasi-netbookish machine (it's a little large to be a netbook, but smaller and lighter than most laptops) like the Asus UL30 series will do virtually everything you need, while weighing only 1.5 kilograms (3.3 lb) with a 6 hour battery life (advertised, probably an hour less), or for slightly more weight, a 12 hour battery life (again, probably a little less in practice). The UL30A and UL30Vt differ only in the presence of a dedicated graphics card, so my earlier notes on the intended usage remain relevant. —ShadowRanger (talk|stalk) 18:55, 25 November 2009 (UTC)[reply]
Agreed. I have a 15" MacBook Pro (the tongue-in-cheek suggestion above is what I hope will become affordable by next spring, when it's due for renewal - but I won't hold my breath ;-), and use it as my sole computer. I'm very happy with it. It's great in a hotel or on the train. But it's too big for flying (coach class) or longer cycling trips, and while sufficient for most tasks, I would very much like a bigger and more ergonomically placed screen when stationary. And while it's quite reasonable value for money, it's hard to rationally justify the value when I can get a EUR 400 Acer laptop with Linux that is sufficient for 95% of all my tasks - web surfing, programming, email, photos, music, movies, writing. --Stephan Schulz (talk) 21:27, 25 November 2009 (UTC)[reply]
Thank you all for your suggestions and advice so far, I appreciate it! I wasn't really taking into consideration the weight but I will definitely factor that in now. The laptop would be used primarily for standard applications like web surfing, word processing, etc. but occasionally would need it to play games (maybe) and video/photo editing. --96.230.227.148 (talk) 17:28, 26 November 2009 (UTC)[reply]

Javascript: Resetting a Form

Hi All,

I have a search-form with some <select> elements which are filled using ajax (drill-down search?). It's working fine for the most part but I want it to reset (i.e. as if the reset button has been pushed) whenever the user navigates back (using the browser buttons/shortcuts) from the search-results-page. I tried putting "document.getElementById('myFormsId').reset()" but it doesn't work. The site I'm developing is here.

To replicate what I'm trying to do, just select a manufacturer, make, model, hit submit, then from the search results use your 'back' button (or alt+left, etc.). I wanted to make it so that if the user goes back, the whole form is reset to its initial state.

Side question, why doesn't disabled="true" for the <select> work on firefox? PrinzPH (talk) 18:03, 25 November 2009 (UTC)[reply]

There is probably an easier way to do this, but have you tried just using onLoad to set the select to a specific item?--TParis00ap (talk) 01:48, 28 November 2009 (UTC)[reply]
The correct syntax for disabling an element is like '<select disabled>', not '<select disabled="true">'. --Sean 14:04, 30 November 2009 (UTC)[reply]

My screwed up SVG file

Hi there, I recently wrote a python script to colour in maps with specific colours. The code for it is here. Unfortunately, while it seems to create state maps correctly (see here), the county maps it creates can be rendered by firefox, but not Inkscape or GIMP and apparently not whatever Wikipedia uses to display it in articles. Gimp says it has an "XML parsing error". The map is File:US Poverty Rates.svg and can be viewed here. Does anyone know how I can fix this problem? Thanks a lot, TastyCakes (talk) 22:00, 25 November 2009 (UTC)[reply]

If you can tell me where I'd get the CSV file you used to generate it, I'll have a look see. In the meantime, I note that it is a very big SVG source (it's nearly 40 times bigger than the blank map) and renderers will likely allocate a fair amount of memory during rendering. The problem may simply be that the SVG renderers in librsvg (which I think MediaWiki runs) and the others are just running out of memory (or another resource), either due to design limitations or to a hard limit to avoid denial-of-service attacks. If that's the case, the first thing to do is to run it on just a subset of the CSV file, and see if size, rather than content, is the factor that kills it (if I'm wrong then it should fail even with a handful of counties). -- Finlay McWalterTalk 22:28, 25 November 2009 (UTC)[reply]
I also notice that the SVG contains non-ASCII characters, and doesn't declare a character encoding. Viewed in emacs you see "C:\Users\Emil\Desktop\Matt<E9>'s Work\Pix\svg" - that E9 character should probably be escaped, or a compatible encoding specified. The W3C validator barfs for what I think is this reason here. -- Finlay McWalterTalk 22:34, 25 November 2009 (UTC)[reply]
Hmm sorry I wasn't entirely clear, I made the screwed up map using this blank map, which is actually bigger than the screwed up output map. If you go and look at the blank map, I think you see the same non-ASCII characters... TastyCakes (talk) 22:43, 25 November 2009 (UTC)[reply]
Hmm the validator seems to have some problems with the blank map as well... TastyCakes (talk) 22:46, 25 November 2009 (UTC)[reply]
But the blank state map does validate... TastyCakes (talk) 22:48, 25 November 2009 (UTC)[reply]
Oops, the "40x" thing was a snafu at my end. The character literals in Matté aren't the same in the two: in the blank one that é is encoded as c3 a9, but in your one it's just a naked e9. So I think you need to get the thing emitting the XML tree to force utf-8 encoding (and emit an svg header to that effect). -- Finlay McWalterTalk 22:56, 25 November 2009 (UTC)[reply]
Ah you're totally right, that seems to be the problem, and now it works! Thanks a lot, I was at a total loss on what to do. TastyCakes (talk) 23:05, 25 November 2009 (UTC)[reply]
I'm happy to help; it's an interesting problem. I hope you'll keep (and ideally augment a bit) the process and script for producing the diagram; there are all too many machine-generated pictures in Wikipedia that we don't have the scripts for, making anyone who wants to fix or improve them rewrite the script again. -- Finlay McWalterTalk 23:09, 25 November 2009 (UTC)[reply]
Ideally I'd like to make the svg code it produces prettier, but my attempts at doing that were a disaster. I haven't tried it yet, but hopefully the same script will work to colour in blank world maps as well. Do you have any suggestions on improving it? And is there a central repository somewhere that I could link my script to? Hopefully one where I won't find a script that does the same thing already there? ;) TastyCakes (talk) 23:21, 25 November 2009 (UTC)[reply]
I just put the source on the commons description page (like commons:File:Wfm_floodfill_animation_queue.gif). The base one has a decent-looking SVG header (doctypes and stuff) whereas yours doesn't (again, I think that's probably a setting for the XML generator). On looking at both, the massive size is partly due to the (probably excessive) precision given for each vertex, and the (probably excessive) number of line segments used to represent the areas (most of which are, after all, just squares). Given the resolution we'll reasonably be rendering them, almost all the zones could stand to be 6 or 8 sided. But to do that properly would need some Douglas-Peucker polygon simplification, and I'm not volunteering for that ;( -- Finlay McWalterTalk 23:34, 25 November 2009 (UTC)[reply]
Ok, well thanks a lot. Maybe the easiest way to get the size down would be to just make it in inkscape and then convert it into a PNG? I guess it's not quite as elegant ;) TastyCakes (talk) 23:56, 25 November 2009 (UTC)[reply]
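For reference, the encoding fix described above boils down to making the script write its XML with an explicit UTF-8 declaration. A minimal Python sketch using ElementTree (the file names are illustrative placeholders, and the original script may build its output differently):

import xml.etree.ElementTree as ET

# Parse the blank base map (illustrative file name).
tree = ET.parse("blank_county_map.svg")

# ... recolour the county <path> elements here ...

# Write the result with an explicit UTF-8 encoding and a matching XML
# declaration, so non-ASCII characters such as "é" are encoded consistently.
tree.write("US_Poverty_Rates.svg", encoding="utf-8", xml_declaration=True)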

adobe digital editions

I'm trying to install Adobe Digital Editions on my computer, in English. My computer happens to be in the Netherlands, but I don't speak Dutch. Somehow Adobe seems to know I'm in the Netherlands and the installation program comes to me automatically in Dutch. I can't figure out how to install in English instead of Dutch, or to change the language to English once the app is installed in Dutch. The Adobe site assures me that the app is available in many languages now, including Dutch...but how do I download and install it in English? Thanks for your advice. —Preceding unsigned comment added by 83.98.238.113 (talk) 23:02, 25 November 2009 (UTC)[reply]

Try changing the country/language settings in Control Panel? F (talk) 05:01, 26 November 2009 (UTC)[reply]


November 26

How to burn .flv files to DVDs

Hello! How do you take a .flv file and burn it to a DVD so that it will play on a standard DVD player in the US, without downloading any dodgy software? I tried searching on Google, but most solutions recommended I download some strange freeware. Is it really that hard that you need a special program to do it? I have VLC media player and an ISO-Image-to-CD/DVD writer. I figured I could convert the video to the right format and burn it. I found this website, but I'm not sure I fully understand the directions; when I tested converting the video with the NTSC-format requirements, the resulting video was very choppy (don't know if that's VLC's fault or my possible typing in the specification incorrectly). Thanks for any help or advice on this topic.--el Aprel (facta-facienda) 01:24, 26 November 2009 (UTC)[reply]

There are 2 things you need to do to be able to view .flv video on a DVD player. The first is to convert it to a format that DVD players understand (which is called MPEG video) and the second is to add the information that the DVD player expects about the video that is there - menus and the like. This ends up with a number of file types - .vob, .ifo and .bup. This is often done in two stages - convert the format and then create the DVD. However, you can find programs that appear to do both - for example, the DVD author at AVS4YOU appears to do both (http://www.avs4you.com/AVS-DVD-Authoring.aspx). --Phil Holmes (talk) 09:53, 26 November 2009 (UTC)[reply]
Convertxtodvd is a stellar program that works really well; I highly recommend it. The only thing is that it's commercial software that you have to pay for, although I think they offer a free trial (but it leaves a watermark on the video). --96.230.227.148 (talk) 17:11, 26 November 2009 (UTC)[reply]
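If a command-line route is acceptable, the free ffmpeg tool has a built-in DVD preset that handles the format-conversion step in one go; authoring the disc itself (menus, VOB/IFO files) is still a separate step, as described above. A minimal Python sketch, assuming ffmpeg is installed and on the PATH (file names are illustrative):

import subprocess

# "-target ntsc-dvd" selects NTSC DVD resolution, frame rate and MPEG-2 encoding.
subprocess.check_call([
    "ffmpeg",
    "-i", "input.flv",
    "-target", "ntsc-dvd",
    "output.mpg",
])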

Computer troubles and open source questions

Basically, I'm at breaking point with Windows-based computers. My computer isn't as fast as it used to be. Of course it still works, but somehow it's not what it used to be. The computer is a Dell desktop from 3 1/2 years ago (January, 2010). It's just used for emailing, internet, and things along those lines rather than advanced stuff. At the same time I have no income to afford a new computer and no help from my parents or sibling. Somehow in the past I had a tendency of breaking computers and getting new ones a short while after that. I'd rather do it right this time around.

Think that's it for now.

Thank you, in advance.

Jessica A —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 03:24, 26 November 2009 (UTC)[reply]

I'm not sure what your question is, exactly. Do you need help speeding up your computer (very likely possible)? Do you need recommendations for a new computer? Chris M. (talk) 04:17, 26 November 2009 (UTC)[reply]
The standard way to fix a slow computer is a format and reinstall. F (talk) 04:58, 26 November 2009 (UTC)[reply]
You say "...and open source questions" - are you looking to switch to Linux or something? Linux is really good for old computers for several reasons - but mostly because each new revision doesn't require vastly more computing resources than the previous one, so you can keep your OS software up to date without needing continual hardware updates to stay current. If that's what you're looking for then probably you should start off by downloading and burning a Linux "Live CD" - you can boot Linux from that and experience 90% of the joys of Linux without having to risk screwing up your PC. If you like what you see then you can install a full Linux distro - and one nice way to ease the pain & risk is to install it as a "dual boot" system where the hard drive is split between a Linux "partition" and your existing Windows software. Now you can reboot your PC and you'll be presented with a little menu that lets you boot into either the wonderful world of Linux - or into your original Windows system. This is handy because you can still boot into Windows to play games or do other Windows-only stuff. When you realise that it's been six months since you last booted into Windows - you can wipe the Windows partition and reformat it into Linux to get back some disk space. But until you're ready to commit, Linux can read files from the Windows partition so you can get to all of your data from within Linux. There are a gazillion Linux distro's to choose from - I happen to use OpenSuSE - but probably the most popular/well-supported version right now is Ubuntu...so I guess that's probably where you should start. SteveBaker (talk) 05:30, 26 November 2009 (UTC)[reply]
I'm not sure whether this would be seen as good advice, but I have a fairly old laptop (a Thinkpad X30, bought second-hand in 2007; 512 megs RAM and 1.2 GHz processor) and it used to take an age after switch on/wake from hibernate/sleep. The disk thrashed and it was pretty much unusable. I thought I'd experiment by getting rid of the anti-virus (AVG). It now boots almost instantly from sleep/hibernate and is a completely usable machine once more. I am careful about what I open/view and have never had any machine I use detect a virus, so I feel comfortable with doing this. It may be worth trying getting rid of the AV, to see whether this has a noticeable effect. However, unless you are always extremely careful, I would not leave a PC without any AV protection. --Phil Holmes (talk) 09:43, 26 November 2009 (UTC)[reply]
I agree - antivirus software is some of the most troublesome software I have to deal with. It can be a huge performance boost to run without any antivirus software at all. The caveat is that if the user doesn't know what they are doing (and sometimes, even if they do), a virus infection can be a serious problem. In general, good security practices are more valuable than good antivirus software; but your usage pattern may change factors in the risk evaluation. Nimur (talk) 22:24, 26 November 2009 (UTC)[reply]
The newest Windows edition - Windows 7 - is rumored to require less RAM than Windows Vista and run faster. Anyway, a 3 1/2 year old computer should be completely able to deal with things like email, internet surfing and the like without any noticeable problem.--Mr.K. (talk) 12:32, 26 November 2009 (UTC)[reply]

Thanx for all of the info that you guys gave me on this. Basically, I was asking more about Linux than computer troubles. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 23:43, 26 November 2009 (UTC)[reply]

Thanx and tried it again. Having a much easier time this time around as opposed to the other time. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 21:00, 27 November 2009 (UTC)[reply]

Toolbar disappeared

I'm using Explorer 8 and my toolbar has disappeared. I tried all the F keys, with and without Alt and Ctrl, to no avail. See here. Note to Astronaut: The reason I was concerned about Besser was not malware but identity theft, because, of all the places on the planet he could have been, he is actually about a mile from me. I thought he was somehow picking up my zip code. WOW!

Is this any use: http://support.microsoft.com/kb/962963 ? --Phil Holmes (talk) 11:28, 26 November 2009 (UTC)[reply]
This problem was discussed at some length last week. See Wikipedia:Reference desk/Archives/Computing/2009 November 20#Toolbar disappeared. I'll be interested if the advice from Microsoft helps. Astronaut (talk) 14:33, 27 November 2009 (UTC)[reply]

The advice from Microsoft didn't work. Yesterday I asked a knowledgeable friend for help with this and he had already done all the things which the advice from Microsoft said (delete ITBarLayout from the registry). The Microsoft page had the option of "Do it for me" and "I'll do it myself". I clicked the former several times and waited but nothing happened. When I went to do it myself I found that all the ITBarLayout's were already deleted (Find didn't find any either). The Microsoft page also has a "Did this work?" option which I'll go to now and say "no" and see what happens. Next I'll call Besser. I e-mailed him to no avail. Maybe, if he's still alive, he'll walk over here and help me. I just had an idea - my delete key isn't working, maybe my F10 key isn't working either. —Preceding unsigned comment added by 98.77.195.206 (talk) 16:11, 27 November 2009 (UTC)[reply]

Today I went to Dell for help. They took over my computer and noticed I was missing a lot of other things also and said I had to reinstall the whole operating system again. I didn't have a blank DVD available, only a blank CD, so I was only able to save a few pictures and music, but I reinstalled the operating system, then all security, then all drivers etc., then the printer. Next I'll download all I've saved and try to get back whatever I've lost. I got my toolbar back and got rid of Explorer 7. Then I'll go to Windows Update and spend the next few hours updating that - CAREFULLY! —Preceding unsigned comment added by 74.233.41.228 (talk) 23:14, 27 November 2009 (UTC)[reply]

Oh well; I was rather hoping you would be able to avoid reinstalling XP. Hopefully your backup to the CD was sufficient and you didn't lose too much. Astronaut (talk) 01:54, 28 November 2009 (UTC)[reply]

Thanks for all your help Astronaut. I lost a lot but I got my toolbar back and I'll never, ever try Explorer 7, 8 or any other. I'm in 6 and I'll stay there as long as I can. I hear the new Windows 7 costs $50 and has a lot of bugs also. —Preceding unsigned comment added by 98.77.186.253 (talk) 15:04, 28 November 2009 (UTC)[reply]

Both IE7 and IE8 are much superior to 6, and there will come a time when you will be unable to get updates without upgrading to IE8. However, you could always try a different browser. Astronaut (talk) 10:41, 29 November 2009 (UTC)[reply]
Well every version of Windows has lots of bugs. You should be aware you're not going to get much support for Windows XP bugs anymore Nil Einne (talk) 23:46, 29 November 2009 (UTC)[reply]

According to the above reference from Astronaut, Internet Explorer 6 was once the most widely used browser version. During the summer and fall of 2009, 8 years after its introduction, IE6 still held the top spot in terms of browser market share. Microsoft themselves have, despite admitting to some of its many flaws, stated that they will support IE6 until Windows XP SP3 support is removed, meaning IE6 will be officially around until 2014, 13 years after its release.

60,000+ jpgs

I have a lot of images completely unorganized, with lots of the same images under different filenames. I need a simple, hopefully free software program that can browse and help organize them, search them via their metadata, and to detect duplicate images and remove them. Any suggestions of good programs which would be good for this? Many thanks for your help —Preceding unsigned comment added by 82.44.55.114 (talk) 14:08, 26 November 2009 (UTC)[reply]

f-spot or digikam? --194.197.235.240 (talk) 14:36, 26 November 2009 (UTC)[reply]
Thanks, but I should mention I need this for Windows, not Linux.
Sorry, I thought f-spot would run on windows too but google seems to think not. But digikam will run on windows (search 'kde for windows'). --194.197.235.240 (talk) 19:28, 26 November 2009 (UTC)[reply]
Not specific to pictures, but I've found Duplicate Cleaner works well at finding duplicates. TastyCakes (talk) 15:16, 26 November 2009 (UTC)[reply]
Visipics and DupDetector are two freeware programs for finding duplicate images. They can find similar pictures of different sizes. Other freeware programs called Plumeria and i-Fun Viewer can sort pictures I think. 78.146.191.42 (talk) 20:51, 26 November 2009 (UTC)[reply]
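For exact byte-for-byte duplicates (as opposed to the resized or re-encoded copies that the tools above can also catch), a short script with the standard library is enough. A minimal Python sketch; the folder path is an illustrative placeholder:

import hashlib
import os

def file_hash(path, chunk_size=1 << 20):
    # Hash the file contents in chunks so large images don't use much memory.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

seen = {}
for root, _dirs, files in os.walk(r"C:\Pictures"):
    for name in files:
        if name.lower().endswith((".jpg", ".jpeg")):
            path = os.path.join(root, name)
            digest = file_hash(path)
            if digest in seen:
                print("duplicate:", path, "==", seen[digest])
            else:
                seen[digest] = path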
Picasa won't help you with finding duplicates (there were other suggestions above), but it's excellent at helping you to organise your images, as well as perform simple fixes. — QuantumEleven 22:55, 26 November 2009 (UTC)[reply]
Picasa has a fatal flaw in that it assumes that the file creation date is the date of the photo and stubbornly refuses to let you change it. So, if you have 60,000 pics that you took over, say, 10 years and you put them on a server to sort them with Picasa, it will claim they were all taken on the day you put them on the server. A very big headache for attempting to organize and sort photos. -- kainaw 23:03, 26 November 2009 (UTC)[reply]
I've found that using a Wiki (a private one) is very handy for organizing photos. You get automatic creation of thumbnails, and things like categories and galleries are a snap to create. If you use MediaWiki (the same software that runs Wikipedia - and, significantly, the vast WikiCommons image farm) - there are extensions to let you do bulk "uploads". Categories can contain categories. Images can belong to multiple categories. You can create galleries from categories. You can add text to images - make pages that mix media. You can also store movies and sounds. When you're happy with your setup - you can easily put the entire shebang onto the web. The revision control mechanism (like the 'history' tab, above) lets you store the original photo from the camera - as well as subsequently cropped, color-adjusted versions - so you can always go back to the original pristine image if you ever need to. This doesn't solve the issue of duplicate images - but you can fix that with one of the other tools before you start. SteveBaker (talk) 01:04, 27 November 2009 (UTC)[reply]

Excel List Problem

Hello, a juicy Excel problem for you here.

I have two lists, each with a word and a number, for instance HAPPY 234. This indicates how many times the word HAPPY has appeared in a document. This list is in columns A and B of a sheet.

In columns C and D of the sheet I have a similar list - and you guessed it, I want to combine the lists so if HAPPY 234 appears in cells A1 and B1, and I have HAPPY 50 in cells C34 and D34 I will get a merged result of HAPPY 284 somewhere else.

I have found a guide [[2]] but this only works if the list is one column wide; mine is two.

Hope this makes sense, any ideas?

Thanks 195.60.13.52 (talk) 16:28, 26 November 2009 (UTC)[reply]

Pivot tables are your friend here. Move the data in your C and D columns underneath the data in the A and B columns, so you have one list in two columns; add a header to each column - say "Word" and "Count", for example; select the whole data area including headers; create a pivot table with "Word" down the left hand side and "Sum of Count" in its body. The pivot table will find all rows with a given word in column A and will add up their word counts from column B. Gandalf61 (talk) 16:49, 26 November 2009 (UTC)[reply]
Thanks for that, that's damn nearly got it. Only problem, the list tops out at 9500 ish - my original combined list is 13000 or so rows. Is there a limit for a pivot table? Thanks 195.60.13.52 (talk) 11:54, 27 November 2009 (UTC)[reply]
This page says there is no built-in limit for pivot table rows in Excel 2003, but you may be limited by available memory. This page suggests a limit of 8000 items, but that may be for an earlier version of Excel. You could sort your list into alpha order, then split in two to analyse, say, A-M and N-Z separately. Gandalf61 (talk) 12:16, 27 November 2009 (UTC)[reply]
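If the pivot-table row limit does turn out to be a problem, the same merge is straightforward to do outside Excel. A minimal Python sketch, assuming the two word/count lists have been exported as two-column CSV files (file names are illustrative):

import csv
from collections import defaultdict

totals = defaultdict(int)
for filename in ("list1.csv", "list2.csv"):
    with open(filename, newline="") as f:
        for word, count in csv.reader(f):
            totals[word.strip()] += int(count)

# Print the merged list, e.g. HAPPY 284 if the inputs held HAPPY 234 and HAPPY 50.
for word in sorted(totals):
    print(word, totals[word])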

Looking for the Ithkuil font available for download somewhere on the Internet

Searching through all the references on its article and many Google search results has been to this point fruitless. 71.161.45.144 (talk) 17:05, 26 November 2009 (UTC)[reply]

Ithkuil is a synthetic language with few users. As such, it was not included in any of the important unicode or other character encoding standards. It seems probable that there is no standardized computer representation of the written form. Further, because its written form grammar is very complicated and position-based, a simple "font swap" inserting symbols in place of other unicode characters would be unsuitable for a full representation. This means you probably won't be able to find fonts, and will have to synthesize them yourself (or write out the language by hand, e.g. with a tablet computer), and save the writing as a raster image. Nimur (talk) 22:15, 26 November 2009 (UTC)[reply]

rsync and non-regular files

To be able to keep a backup of my entire Fedora Linux system, I copied most of the system directories (such as /etc, /lib, /usr and so on) into a subdirectory under my own home directory and then both chowned and chgrped the copies under my personal account instead of root. But when I tried to run rsync it still skipped them as "non-regular files". What makes them so non-regular? JIP | Talk 19:44, 26 November 2009 (UTC)[reply]

Block devices, character devices, named pipes, sockets, doors on Solaris, and network sockets on Neutrino. Essentially all the things that hang around in a unified unix filesystem tree that aren't files, directories, or symlinks. -- Finlay McWalterTalk 21:09, 26 November 2009 (UTC)[reply]
But the thing is, it happens even with files that aren't block devices, character devices, named pipes, sockets, doors, network sockets or such, just plain files. JIP | Talk 04:10, 27 November 2009 (UTC)[reply]
If you want to find out where such objects are (and the rsync log isn't helping you) use find: find -type b to find block devices, etc. -- Finlay McWalterTalk 21:12, 26 November 2009 (UTC)[reply]
Personally I don't backup /lib /usr /var /opt /boot or things like that; I don't really think such a backup will turn out to be useful. I backup /home and /etc (with rsync, among other things). For cases where I want to say "the system is working nicely now; I want to be able to come back to this" then I dd the physical volumes off to another physical disk. Keeping any kind of backup on the same physical disk doesn't seem like a productive idea. -- Finlay McWalterTalk 21:19, 26 November 2009 (UTC)[reply]
It's not to keep a backup on the same physical disk. I use rsync to back my entire home directory to an external disk. This way it would automatically back up the copies of the system directories too, if I could get it to work. JIP | Talk 04:10, 27 November 2009 (UTC)[reply]
Those files might be symlinks, or hardlinks. I don't know rsync well enough to know what else would cause it to throw them out like that. Maybe you could post the ls -l output on some of the files it doesn't like (not that doing so would elucidate that much). Finlay's point isn't about the same physical disk as much as it's about the fact that those system directories are going to be standard, and if your system gets hosed, 99% of the time you're only interested in user files and not those system ones that he mentions (although I'd include some of /var for logs). In the rare event you want a straight copy, using dd is a low-hassle option. Shadowjams (talk) 09:12, 28 November 2009 (UTC)[reply]
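To see exactly which entries rsync is treating as non-regular, a small diagnostic script can walk the copied tree and report anything that isn't a regular file, directory or symlink. A minimal Python sketch; the starting path is an illustrative placeholder:

import os
import stat

def classify(path):
    mode = os.lstat(path).st_mode   # lstat so symlinks themselves aren't followed
    if stat.S_ISBLK(mode):
        return "block device"
    if stat.S_ISCHR(mode):
        return "character device"
    if stat.S_ISFIFO(mode):
        return "named pipe"
    if stat.S_ISSOCK(mode):
        return "socket"
    return None

for root, dirs, files in os.walk(os.path.expanduser("~/system-backup")):
    for name in dirs + files:
        path = os.path.join(root, name)
        kind = classify(path)
        if kind:
            print(kind, path)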


November 27

GPS-based alarm

I would like to know if there's a program that could alert you when you reach a certain point on the map. Suppose I am sleeping while taking a train to a point B, and I would like to be woken at point B minus 1 km. Has anyone heard of such a program? Thanks... roscoe_x (talk) 01:27, 27 November 2009 (UTC)[reply]

On what sort of system are you asking this for? Is it a mobile phone, an iTouch, or a computer with an attached GPS device? Magog the Ogre (talk) 18:24, 27 November 2009 (UTC)[reply]
I almost wonder if you're fishing here because you've described exactly an app that's on the iphone but whose name I can't remember right now. It's called iNap, and read this for more. I don't think it somehow makes you get a GPS signal underground.... Shadowjams (talk) 09:06, 28 November 2009 (UTC)[reply]
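For anyone curious about the underlying idea rather than a ready-made app: it is essentially a loop that polls the current position and compares the great-circle distance to the destination against a threshold. A minimal Python sketch; read_position is a hypothetical callback for whatever GPS receiver is available, and the coordinates are illustrative:

import math
import time

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def wait_for_arrival(read_position, destination, radius_km=1.0, poll_seconds=30):
    # Poll read_position() -> (lat, lon) until within radius_km of destination.
    dest_lat, dest_lon = destination
    while True:
        lat, lon = read_position()
        if haversine_km(lat, lon, dest_lat, dest_lon) <= radius_km:
            print("Wake up - within %.1f km of your stop" % radius_km)
            return
        time.sleep(poll_seconds)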

Difference/s with Suse, Ubuntu, etc along those lines

Basically, it was in one of the answers to my previous question on here. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 04:25, 27 November 2009 (UTC)[reply]

Are you asking what the difference between Linux distributions is? If so, it is primarily the package managers. Choose a package manager you like. Pick the popular distribution for that manager. For example, APT and YUM are two very popular package managers. If you like APT, you will likely choose a Debian-type Linux like Ubuntu. If you like YUM, you will likely choose a RedHat-type Linux like Fedora. -- kainaw 05:07, 27 November 2009 (UTC)[reply]
Distributions also seriously vary the license terms, so pay close attention to those if you have special commercial or legal needs. Most home users and many commercial uses are fine with "free", and need no further detail. Nimur (talk) 15:35, 27 November 2009 (UTC)[reply]
SUSE uses .rpm files to install programs and uses Zypper (not YUM) to install them via the command line. It uses YAST to install them graphically. It has traditionally used KDE as its interface, but you can also choose GNOME during installation. SUSE has been around since 1992.
Ubuntu is based on Debian, so it uses Aptitude as the package manager and .deb files to install programs. It uses Synaptic to install them graphically. It uses GNOME as the GUI, but you can also use KDE by installing Kubuntu. But Ubuntu works the best with GNOME. Ubuntu has been around since 2004.
Kainaw: I wouldn't recommend Fedora as it is used by Red Hat to test technologies before they're included in Red Hat, so I don't consider it stable enough for use by a newcomer. Likewise, I would recommend Debian over Ubuntu as Ubuntu is based on the unstable version of Debian. I've also used SUSE and it's very stable and easy to use.--Drknkn (talk) 06:15, 27 November 2009 (UTC)[reply]
Which technologies are tested in Fedora before inclusion in "Red Hat" (presumably Red Hat Enterprise Linux)? Nimur (talk) 15:37, 27 November 2009 (UTC)[reply]
There is a rather widespread, but completely false, belief that Fedora is the testing version of RedHat. The real deal is that RedHat is supported. So, every package officially included in RedHat is supported. You have to pay for that support. Fedora is not supported. You take it or leave it. So, it is not necessary that every package be supported (since none of them are). The end-result is that packages are quickly added and updated in Fedora. They are slowly added and updated (with the exception of security fixes) in RedHat. Because RedHat lags behind Fedora, most people believe that the packages are tested in Fedora for the purpose of including them in RedHat. So, you get comments such as "technologies are tested in Fedora before inclusion in RedHat." -- kainaw 21:45, 27 November 2009 (UTC)[reply]
Here are some citations to back up my claim: [3], [4]. Do you have any proof to back up your claim?--Drknkn (talk) 21:21, 28 November 2009 (UTC)[reply]
Ummm... yes. See Fedora Project. While RedHat has a lot of control over the Fedora Project, the Fedora Project steers development of Fedora. They do not simply test new packages to see if they will work on RedHat. Most of the packages available for Fedora since Fedora Core 1 in 2003 aren't even available in an official install of RedHat. So, if the claim is true that Fedora is a test-bed for Redhat, why are all those Fedora-only packages missing from RedHat? -- kainaw 21:43, 28 November 2009 (UTC)[reply]
Since we are using original synthesis in this debate, I should mention an anecdote of my own. I have used both Fedora and Red Hat Linux. After installing new releases of Fedora, I would then have to download several hundred megabytes of updates. They were clearly beta-quality releases. Admittedly, I've had more issues with Ubuntu than Fedora. Red Hat, on the other hand, has tended to be very stable, with fewer updates. Which one is the test bed? You decide.--Drknkn (talk) 22:30, 28 November 2009 (UTC)[reply]
If there's still any uncertainty, here is the Red Hat Enterprise Linux support schedule. "Test" packages are tested prior to the "General Availability" date for any release, in the developer alpha or beta period. You need to be pretty tight with Red Hat to get in on these things (usually, this means you need to be a research group or corporate customer already subscribed to some of their other products). The notion that Red Hat "tests" anything on any platform other than the main line RHEL is just silly - it would defeat the purpose of testing and compatibility certification. The "test bed" is exactly that - a pre-release RHEL. Nimur (talk) 22:35, 28 November 2009 (UTC)[reply]
Sorry, I failed to comment directly on your references. Had two screaming kids in my lap. The first one is correct only because it is rather old. When it was written, RedHat (and most of the world) saw Fedora as nothing more than a test-bed for RedHat. That view has changed. The second reference is also correct. RedHat likes to think that they own Fedora and they can use it to test things for RedHat. However, the Fedora Project doesn't sit around wondering how to make RedHat better. They are concerned with Fedora. If RedHat (or any other distro) benefits from their work, so be it.
I don't feel it is fair to call any distro (Fedora or Ubuntu) a "test-bed" just because it requires a lot of updates. There are two common mindsets for distros. Some are cutting-edge. When a program is updated, the package for that program is updated. Others hold back. Only when a program has a completely stable release will the package be updated. Since most distros use open-source software, the updates are continuous. It isn't like GIMP or Firefox are going to hit a "final version". They keep getting developed. If you demand to have the most recent version of all the programs, use a cutting-edge distro. If you require absolute stability, use one that doesn't update packages very often - and when they do, it is to a rather old version. As for test-bed, both Fedora and Ubuntu have a test-bed, which is the "testing" package repository. If you want real cutting edge with all the headaches, switch to the testing repo. -- kainaw 22:41, 28 November 2009 (UTC)[reply]

OP: I wrote a very long response to a similar question back in May. For your convenience, I will repost it here. In brief summary, Kainaw's description pretty much hit it on the head - most people know their Linux distribution by its package manager. Most Linux and Unix users would have a hard time knowing which distribution they were actually on if you took away the logos. Only a few esoteric details, and some nitpicky details of configuration, separate the major Linux distributions today; in fact, even most Linux/Unix/POSIX-like operating systems are pretty similar in 2009. So with this being said, I re-post my May 2009 response:

I spent a good portion of my undergraduate time working between different unixes and linuxes and qnixes and things you've never heard of. Boy, is it confusing! First of all, you've made the important first step in comprehending that the front-end user interface (GNOME or KDE or fvwm or whatever) is not the operating system distribution. (In fact I've run all of the above environments on all of the above *nixes and sometimes as a result I can't tell which machine I'm currently on!). And, your csh and bash and tcsh and zsh will probably run on all of the above as well. So... what's it all about? What's the difference between the distributions? (Linux distribution might help out here, but seriously... what exactly is a "distro" anyway? Why is Debian different than Ubuntu, if they both use the same package manager, same shell, same GUI, same libraries, ...)
Well, first of all, the Linuxes are all running the linux kernel, while the Solarises and BSDs and Mac OSXes are not. (And QNX? Well, just suffice to say that although it presents you with a POSIX-like shell and a lot of the standard system-calls, it's... not very much of a linux at all!) But all of them are POSIX compliant, and support networking and multithreading and encryption and so forth. But if you are going to remap your memory system for a custom coprocessor and need to recompile your kernel memory-module to handle variable page sizes based on current coprocessor instruction, you're going to need to choose your kernel carefully (I've heard, from people who would know, that CENTOS and Solaris make this task "easier"). And if you were planning to do something more benign, like maybe mixed shared memory programming with OpenMP and a little pthread code in the same program, you might actually find that there's a difference in the dynamic scheduler capability for different incarnations of the kernel. Or maybe you've got some files mounted on an AFS drive and you want to ensure that the network traffic stays encrypted, all the way through the machine, past the network, up to the shell, through the user-space, and decoded at the point-of-use in some kind of protected memory. Then you better have a kernel with libPAM module support! Are you doing these things? If not, you may never really notice your distribution.
Backing up a notch or two, at the "intermediate" level, you are going to want to install or compile some program some day which is going to have some dependencies. A lot of libraries are pre-packaged and precompiled for the common distributions (in the form of a DPKG or an RPM or sometimes even straight-up .so files). Pick a distribution that's going to be used by people who work with things that you work with... that way, you'll have a community which has already prepared the sort of tools you are going to need. It's not often worth anybody's time to trace back seven levels of library-dependency when you just want to get a standard tool to run.
Compiler support may be an issue between vendors. Some of the more esoteric optimization flags and the less standard extensions (like some c99 complex-math support) turn out to be not very platform-portable - this usually means that it's getting linked in with some system library (like libm.so).
So, what's the moral here? Distributions make a big difference if you're doing non-standard things; but if you follow "best practice" and write code that doesn't link with weird libraries, and doesn't jump from high-level logic to operating-system calls in the same module, you'll be better off and spend less time tracking down portability problems. I would stick with Ubuntu if I could, but some of my tools are only available on other linux platforms (and aren't worth the hassle of porting).
Hopefully this will give you some perspective - use "whatever distribution is easier." If you actually get to a point in your professional or academic development when you can decisively state that "the Solaris cilk scheduler gave me a 20% speed improvement" or "the network stack on QNX was insufficient to handle packet buffers for gigabyte-sized files using https" or some other distribution-specific issue, you're probably going to care what distribution you are using. Until then, pick a good shell, pick a good user-interface, and use as much standard unmodified software as you possibly can.

Nimur (talk) 15:43, 27 November 2009 (UTC)[reply]

Thanx for all the info. Sorry for all the confusion that I caused with this. To me, all of your answers were confusing. Went with Ubuntu because it seems more stable than SUSE and the like. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 21:03, 27 November 2009 (UTC)[reply]

Sorry for the confusing answers. Sometimes I give too much information, with the goal of completely explaining everything; a lot of people just need a quick summary answer. I think Ubuntu will be a good choice for most of your needs. Good luck, Nimur (talk) 05:00, 28 November 2009 (UTC)[reply]
I actually said that Ubuntu is less stable than SUSE.--Drknkn (talk) 22:30, 28 November 2009 (UTC)[reply]

Home server and net security

Hi, if I set up a computer at home to run as a Web server with Apache and MediaWiki and such, and I only aim to access this at home (all the computers will connect to the same router), am I safe from random people from the Internet accessing this server as well? In particular, is it sufficient to run one of these "port scanners" on the Web (I think Symantec or someone has one, at least), and if I'm all "stealth", I'm good? I haven't opened any ports on the router. As an add-on question, is there any easy and safe way to then open the computer to access from outside with a reliable password (or other identification) system? Thanks! Question inspired by SteveBaker's MediaWiki marketing above Jørgen (talk) 12:38, 27 November 2009 (UTC)[reply]

"Safe" is relative. But if your router is not forwarding the ports, there is little additional risk. The easiest way to allow access from the outside is to forward just the ssh port, and to have strong passwords on the machine - or, even better, only allow the use of sufficiently strong public key exchange for ssh. Again, this is not absolutely safe, but likely to be safe enough for most people. If you need absolute safety, write on wax tablets and burn them before reading. ;-). --Stephan Schulz (talk) 12:51, 27 November 2009 (UTC)[reply]
Of course you could provide an extra layer of security by setting Apache to only allow people from your own network in. E.g., place in your .conf file (the address prefix below assumes your LAN uses the private 192.168.100.x range; adjust it to match your own network):
Order deny,allow
deny from all
allow from 192.168.100

Magog the Ogre (talk) 18:22, 27 November 2009 (UTC)[reply]

Thanks both! Opening only SSH sounds like a good idea to start with. Now, would you consider apache or MediaWiki user control combined with an open HTTP port inadequate? No-one's after me, I'm just seeking protection from random hackers. Jørgen (talk) 20:09, 27 November 2009 (UTC)[reply]
Only open port 80 if it is only a web server and you aren't doing HTTPS stuff. Then, hackers can only get in on port 80. There is still the issue of scripts running on the server. Keep up on what you are running. Many scripts have vulnerabilities that come up over time and need to be updated. While the user can't alter your router through one of these vulnerabilities, they can alter the server itself and turn it into a spam-bot rather easily. A good log file manager will help you keep up on what vulnerabilities the hackers are looking for. For me, they are about 90% Windows vulnerabilities, but at least 10% are aimed at popular PHP, Python, and Ruby scripts. -- kainaw 20:57, 27 November 2009 (UTC)[reply]
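To make the log-watching suggestion above concrete, here is a rough Python sketch that counts requests in an Apache access log matching a few common probe patterns. The log path and the patterns are only illustrative guesses, not a complete or authoritative signature list:

import re
from collections import Counter

# A few example signatures of automated probes; real lists are much longer.
probes = {
    "Windows/IIS probes": re.compile(r"cmd\.exe|root\.exe|owssvr\.dll", re.I),
    "PHP script probes": re.compile(r"\.php\?.*(\.\./|=https?://)", re.I),
    "admin panel scans": re.compile(r"phpmyadmin|wp-login|/admin/", re.I),
}

hits = Counter()
with open("/var/log/apache2/access.log") as log:   # path varies by distribution
    for line in log:
        for name, pattern in probes.items():
            if pattern.search(line):
                hits[name] += 1

print(hits.most_common())

Running something like this from cron (or just using an off-the-shelf log analyser) gives a rough picture of what the scanners are currently probing for.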

What have I done to Firefox?

Everything I type into FF now comes out like ூபகே. By everything I mean Wikipedia (ஐகககஜா்கோ), the address bar and Google (உததுதா) searches. I have to type this into Notepad then copy and paste it here. I can't see anything obvious that I did to change anything but I really need to change it back. Thanks. Enter CambridgeBayWeather, waits for audience applause, not a sausage 13:31, 27 November 2009 (UTC)[reply]

Looks like you have somehow changed your keyboard language. What's your operating system? --Mr.98 (talk) 13:41, 27 November 2009 (UTC)[reply]
For what it's worth, you're typing in Tamil. Algebraist 13:44, 27 November 2009 (UTC)[reply]
I'm using XP and it appears to be confined to FF as I'm now using IE. I thought it looked like Tamil but I wasn't sure. Wouldn't have been so bad if this was mine but it's the work computer. Enter CambridgeBayWeather, waits for audience applause, not a sausage 13:57, 27 November 2009 (UTC)[reply]
It's OK. I should have thought and rebooted the computer. Still have no idea what I did to cause that but rebooting made the problem go away. Thanks. Enter CambridgeBayWeather, waits for audience applause, not a sausage 14:08, 27 November 2009 (UTC)[reply]
Has a Tamil speaker at your work used your PC and inadvertently (or as a joke) changed the input language to Tamil? Astronaut (talk) 14:35, 27 November 2009 (UTC)[reply]
No, it had to have been me. I've been the only person here for several hours and there was no problem until then. In fact the only reason that any other languages are on this computer is so I don't have to put up with ? or ? in articles. I don't suppose that anyone else even knows there are other languages installed. I checked and the computer doesn't have a hotkey set up to change to any language. Enter CambridgeBayWeather, waits for audience applause, not a sausage 14:43, 27 November 2009 (UTC)[reply]
If you have multiple keyboards/languages defined, there is a hot-key sequence to switch between languages - on my PC Alt-LeftShift will switch between English and Japanese input. First check to see if you can make the language bar appear: right-click on the taskbar and see if "language bar" is one of the toolbars you can activate. If not, you will have to look for it in the Control Panel. When activated, there should be an option for the language bar settings, which includes key settings. Unfortunately, I cannot guide you more accurately because I know that is one of the things that was changed in the move from XP to Vista. Astronaut (talk) 16:52, 27 November 2009 (UTC)[reply]
Thanks. I activated the language bar right after the Tamil started appearing. It shows that we can input in Tamil, two other languages and US English, but it was set to English and none appeared to have a setting for a hotkey. Right now I'm at home so I can't check to see what the other two were. I thought that changing that would make the default input into all programs the same, and not just one. I wouldn't really be that bothered, but if I can do it that easily then so can someone else. What makes it worse is just how long it took me to think and reboot. I should have tried that first! Enter CambridgeBayWeather, waits for audience applause, not a sausage 00:11, 28 November 2009 (UTC)[reply]
Ah, now I'm on an XP machine. If the language bar is not visible, go to control panel and choose "Regional and Language options". On the "Languages" tab, click "Details" to make the "Text services and Input Languages" box appear. At the bottom of the "Settings" tab, the "Key Settings..." button lets you set up the hotkey sequences. If the language bar is visible, choose "Settings..." to bring up the "Text services and Input Languages" box. Astronaut (talk) 01:51, 28 November 2009 (UTC)[reply]
Thanks. I started it up yesterday thinking it would be the problem but the input was set to US English. The other options are Tamil and Sinhala. I'm at work so I just checked and there are no hot keys set, so it's hard to understand how it could have changed if the language bar wasn't running. I also found that the input setting is per program, the top one, and not system wide, thank goodness. Please don't spend a lot of time on this as it's not likely that anybody else will manage to do that. As the reboot solved it I'm just curious as to how it happened. Thanks. Enter CambridgeBayWeather, waits for audience applause, not a sausage 04:32, 28 November 2009 (UTC)[reply]

Difference between Intel i7 CPUs

What is the (practical) difference in performance between Core i7-860 and Core i7-930? --Andreas Rejbrand (talk) 13:37, 27 November 2009 (UTC)[reply]

The i9xx series uses QuickPath Interconnect, which will yield higher performance to main memory and to peripheral devices. You can compare other specs at our Core i7 comparison article. I don't believe our article's un-sourced price estimate for the i7-930 - it is a higher performance CPU and will probably be more expensive than the i7-860. I also recall reading some rumors online that there will be a serious reorganization of the i7 brand to re-label the i9xx series as high-end server processors "because they are too cheap" for their performance. (I can't find this forum post now.) But you can look at the Xeon Harpertown E5xxx series (almost the same silicon, ~$1000 and up) and compare for yourself. Anyway, here's some more detail on the i7-920 for comparison. Nimur (talk) 15:24, 27 November 2009 (UTC)[reply]

Memory on the computer

Where do I look to find out how much memory I have and used on the computer? —Preceding unsigned comment added by 68.0.157.163 (talk) 22:46, 27 November 2009 (UTC)[reply]

What operating system? --LarryMac | Talk 22:50, 27 November 2009 (UTC)[reply]
If it's Windows, Task Manager and Process Explorer will tell you. In Task Manager it's under the Performance tab, and in Process Explorer go to View -> System Information. —Preceding unsigned comment added by .isika (talkcontribs) 22:57, 27 November 2009 (UTC)[reply]
To find out how much memory is installed, simply press Win+Pause. --Andreas Rejbrand (talk) 00:12, 28 November 2009 (UTC)[reply]
Pause? Not OP but I have no pause on my keyboard. —Preceding unsigned comment added by 82.44.55.75 (talk) 00:35, 28 November 2009 (UTC)[reply]
Almost all keyboards have a Pause key; probably yours too. --Andreas Rejbrand (talk) 01:11, 28 November 2009 (UTC)[reply]
It is also labelled "Break". Astronaut (talk) 01:36, 28 November 2009 (UTC)[reply]
And it's usually on the far right of the very top row of keys. StuRat (talk) 07:36, 28 November 2009 (UTC)[reply]
...and if it's Linux (or Unix) you can type 'free' at the command prompt to get an overally summary or 'top' to get a task-by-task breakdown that updates as programs come and go and change their memory usage. SteveBaker (talk) 01:39, 29 November 2009 (UTC)[reply]
That was a good answer, Steve, overally. :-) StuRat (talk) 03:23, 30 November 2009 (UTC) [reply]
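For what it's worth, the same numbers that 'free' reports can also be read programmatically. A minimal Python sketch (Linux only) that pulls the totals straight from /proc/meminfo:

# Read total and free memory from /proc/meminfo; values are reported in kB.
fields = {}
with open("/proc/meminfo") as f:
    for line in f:
        name, _, value = line.partition(":")
        fields[name] = value.strip()

print("Total memory:", fields["MemTotal"])
print("Free memory:", fields["MemFree"])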

Closed Source and Open Source

Recently uninstalled Ubuntu again because I was having such a hard time with it. Now I'm wondering: is there any difference between closed source and open source? Since I have already used Firefox and Wikipedia.

Think that's it for now.

Thank you, in advance, again. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 23:07, 27 November 2009 (UTC)[reply]

Well, the obvious difference is that open source releases its source code. Do you mean in performance? Thanks, gENIUS101 00:03, 28 November 2009 (UTC)[reply]


Closed source = recipe is secret, Open source = recipe is shared and published for all to see. —Preceding unsigned comment added by 82.44.55.75 (talk) 00:09, 28 November 2009 (UTC)[reply]

Thanx. Basically, in performance and so on as well.--Jessica A Bruno 01:14, 28 November 2009 (UTC) —Preceding unsigned comment added by Jessicaabruno (talkcontribs)

Both have strengths and weaknesses. A strength of open source is the quality of the widely used applications (Linux, Emacs, gcc, Firefox...). A weakness is integration and polish, although Ubuntu has done a pretty good job in that respect. In general, most internet infrastructure is open source, and much more robust than the corresponding closed source programs. Personally, I use closed source for stuff that does not matter (iTunes, DVD player, Safari), and open source for stuff that does matter to me (emacs, gcc, LaTeX, make, Python), first because I don't want my work to be hostage to any particular software maker, and secondly because those tools simply work better for me. --Stephan Schulz (talk) 02:05, 28 November 2009 (UTC)[reply]

Thanx for the latest response. Interesting because how both of them work anyway.--Jessica A Bruno 17:56, 28 November 2009 (UTC) —Preceding unsigned comment added by Jessicaabruno (talkcontribs)

Good lord, the Reference Desk has lost its purpose in life: to provide references. See the articles open source and closed source. Tempshill (talk) 18:47, 28 November 2009 (UTC)[reply]
It is also worth mentioning that open or closed source is related to the licensing issue. The corollary of open and closed source code is free software and proprietary software (roughly speaking). Since Jessica A. Bruno (the OP) is probably not a programmer, this is most likely a more useful categorization than "open-source" or "closed-source". Usually, only programmers need access to the source code; end-users want to know whether their programs are free (as in beer) and free (as in speech). Nimur (talk) 22:50, 28 November 2009 (UTC)[reply]

Sorry, tempshill, for all the trouble that I caused here. I understand where you are coming from on this.

Thanx, Nimur for your answer to my question here.--Jessica A Bruno 22:09, 30 November 2009 (UTC)

Cropping an image

I've officially gone stark raving mad over this now. My mom got me a digital photo frame for Eid al-Adha, and I've been messing with it for over an hour. I can get over the fact that it only supports JPEG. I can even get over its rather limited settings and options. But what I can not get over is its aspect ratio.

All 12 of its sample pictures are sized 480x234. That's 480 wide, and 234 tall. And yes, at ~2.051:1, that is a nonsense aspect ratio. I doubt that I have any images in the 1000+ on my computer that quite match that specification. The obvious solution is to crop some copies of images, but there rises a difficulty: I don't know of any programs that can do so appropriately. Sure, Windows Live Photo Gallery can crop images, but only at fixed aspect ratios / no aspect ratio at all. I've perused through search engine results to no avail. Is there any program I can use to crop a photo according to a certain aspect ratio?--The Ninth Bright Shiner 23:41, 27 November 2009 (UTC)[reply]

Yes, GIMP will. To do it, open the image, click on the rectangular select tool, in the toolbox click the "fixed" radiobox and select "aspect ratio" (as opposed to width, size, height) in the adjacent dropdown box. Put the ratio you calculated above into the field below that (with a colon delimiter) and you're done. Then using the select tool in the picture's window makes for a select that's always of that ratio; you can drag it around to move it, and drag its sides and corners, and it'll always be of the desired ratio. -- Finlay McWalterTalk 00:06, 28 November 2009 (UTC)[reply]
Oops, I should add: to actually crop the image to the box you've selected, it's simply image->crop-to-selection -- Finlay McWalterTalk 00:08, 28 November 2009 (UTC)[reply]
And it's sensible to then scale the image in GIMP to the size your frame wants (both to save storage space on the flash card and because GIMP's image resizer is very likely to be superior to that in the frame's firmware); you simply do image->scale-image in GIMP. -- Finlay McWalterTalk 00:15, 28 November 2009 (UTC)[reply]
Oh, and yet another thing (you can tell I've done exactly the task you're asking about): when you scale the image, click the little chain that spans the width and height boxes, and make sure the chain is broken. That chain constrains the scale to the current proportion, but (in part because of that weird ratio you're having to use) you might end up with images that would naturally scale to 480x233 rather than 480x234 (due to rounding issues). This way, again, you're using GIMP's excellent scaling algorithm, and not risking the kludgetastic one in the frame seeing a 480x233 pixel image and doing a bad job of scaling it, for want of that one row. -- Finlay McWalterTalk 00:27, 28 November 2009 (UTC)[reply]
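If there are many photos to convert, the same crop-then-scale recipe can be scripted rather than done by hand in GIMP. Here is a minimal Python sketch using the Pillow imaging library; the 480x234 target comes from the question, the file names are made up, and it simply takes a centred crop:

from PIL import Image

TARGET_W, TARGET_H = 480, 234
TARGET_RATIO = TARGET_W / TARGET_H

def crop_and_scale(src, dst):
    img = Image.open(src)
    w, h = img.size
    if w / h > TARGET_RATIO:              # too wide: trim the left and right edges
        new_w = int(h * TARGET_RATIO)
        left = (w - new_w) // 2
        img = img.crop((left, 0, left + new_w, h))
    else:                                 # too tall: trim the top and bottom
        new_h = int(w / TARGET_RATIO)
        top = (h - new_h) // 2
        img = img.crop((0, top, w, top + new_h))
    img.resize((TARGET_W, TARGET_H), Image.LANCZOS).save(dst, "JPEG")

crop_and_scale("holiday.jpg", "holiday_frame.jpg")

A centred crop won't always frame the subject well, so for important photos the interactive GIMP method above is still the better choice.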
Just curious: What happens if you give the frame an image in a more normal size and aspect ratio, say: 1280 x 1024? Astronaut (talk) 01:40, 28 November 2009 (UTC)[reply]
Nearly every digital frame on the market will show an image in any size/ratio, but there will be a significant delay while it resizes the photo. For that reason, it is preferred to manually resize and crop photos before putting them on digital frames. -- kainaw 03:22, 28 November 2009 (UTC)[reply]
Actually, in my frame's case, it tended to zoom in and cut off portions of the picture. Granted, that's what I'm doing manually, but I'm doing it properly. And by the way, GIMP works wonderfully. Thank you!--The Ninth Bright Shiner 03:35, 28 November 2009 (UTC)[reply]

November 28

Possible to add WiFi "n" to PS3 via USB dongle?

Is it possible to add 802.11n WiFi to the Playstation 3 with one of those USB WiFi .11n dongles? I'm noticing streaming Netflix is having a hard time keeping up with HD shows over PS3's built-in "g" WiFi. --68.103.143.23 (talk) 00:20, 28 November 2009 (UTC)[reply]

You cannot install the driver for the wifi adapter in the PS3 OS. If you run Linux on your PS3 for some reason, you could install the driver there. The common method to get 802.11n is to get a bridge configured to talk "n" instead of "g" to your router. Then, connect the bridge via the wired ethernet port on the back of the PS3. So, the PS3 will have a 1 Gbit/s wired link to the bridge and the bridge will be limited to "n". -- kainaw 03:19, 28 November 2009 (UTC)[reply]
How fast is your Internet connection? It's just that 802.11g is 54Mb/s (although realistically you probably don't get more than 30Mb/s), but even the 30Mb/s is still likely to be faster than your Internet connection, which is probably the limiting factor here. If it is, then adding 802.11n (which I don't see is actually an option anyway) just wouldn't make any difference. The only other thing I can think of is that perhaps the HD video is encoded with more bandwidth than the PS3 can handle, but I don't know what it can/can't handle with regards to that, so it's just another possibility. ZX81 talk 03:45, 28 November 2009 (UTC)[reply]
The PS3 handles the HD codec just fine. I've watched 1080p movies on the PS3 and it doesn't glitch. -- kainaw 04:06, 28 November 2009 (UTC)[reply]
I'm not doubting that the codec works (the original poster already verified that it does), but rather questioning how high a video bitrate the PS3 can cope with. The higher the bitrate, the more data it has to process at once, up to the point where it can't process any more, but I honestly believe the problem is more to do with the Internet connectivity than the bitrate. ZX81 talk 04:32, 28 November 2009 (UTC)[reply]

How to create a new website

how to create a new website? —Preceding unsigned comment added by Sureshsivam (talkcontribs) 02:10, 28 November 2009 (UTC)[reply]

Learn HTML and create a website or use a WYSIWYG editor and create a website. -- kainaw 03:12, 28 November 2009 (UTC)[reply]
and after that, get some hosting for the site so people can actually visit it; see web host. —Preceding unsigned comment added by 82.44.55.75 (talk) 10:59, 28 November 2009 (UTC)[reply]
Or, use Google Sites or one of the other free services, which don't give you as much flexibility, but are probably a better (and certainly cheaper) way for you to get started and see how much you want to take on. Tempshill (talk) 18:45, 28 November 2009 (UTC)[reply]
You might also like to register a domain name for your site SKYFUDREAMCLOUDS - TALK // CONTRIBUTIONS 23:26, 28 November 2009 (UTC)[reply]

Cross-configure Ubuntu Server

My desktop computer's video card died today, and it has no onboard video. I'd like to convert it to a headless Ubuntu Server with SSH. I plan to buy a USB external dock for the hard drive and connect it to my laptop for the conversion. Is an installer available that will let me do this? If not, where can I learn how to install it manually? The desktop's CPU is a Pentium 4 (i386 architecture), and the laptop's is a Core 2 Duo (amd64 architecture), so if I use chroot with apt-get, I may need to use architecture emulation as well. NeonMerlin 03:19, 28 November 2009 (UTC)[reply]

The best way would be to install the OS through the laptop onto the hard drive in the enclosure (make sure you get the 32-bit version; it will run on both computers). Then, when you have finished installing, have ssh ready, and are sure it can boot up properly, you can move the drive to your headless computer. (Tip: take the laptop's internal hard drive out first, so the only drive the OS knows about is the one in the enclosure.) Also, I strongly suggest you get a cheap video card from somewhere - they're throwing them away these days (go to a local school and ask the admin for some); it will help you diagnose problems in the BIOS, network problems (which you can't otherwise diagnose, because ssh won't be reachable), and so on. 129.97.226.160 (talk) 07:47, 28 November 2009 (UTC)[reply]

files and their shortcuts

How can we keep the shortcut to files pointing to the same file even if it is moved to another place? —Preceding unsigned comment added by 113.199.159.177 (talk) 05:33, 28 November 2009 (UTC)[reply]

You didn't say what operating system you are using. Here's a couple of methods under Unix/Linux:
1) Add a new link pointing from the old location to the new one. Now you have a link pointing to a link pointing to the file, but it still works the same. I'm not sure if there's a limit on how many levels of links you can have.
2) Use relative paths containing logical file names. So, if the file is in $HOME/bin, and $HOME = /usr/adams, and you then change it to /usr/baker, then, as long as the link points to $HOME/bin (as opposed to /usr/adams/bin), it will follow the file to the new location automatically. StuRat (talk) 07:30, 28 November 2009 (UTC)[reply]
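To make method 1 concrete, here is a small self-contained Python sketch (Unix only, with made-up file names) showing how a forwarding symlink left at the old location keeps an existing shortcut working after the file moves:

import os, shutil

# Set up: a file and a "shortcut" (symlink) pointing at it.
os.makedirs("old_place", exist_ok=True)
os.makedirs("new_place", exist_ok=True)
with open("old_place/report.txt", "w") as f:
    f.write("hello\n")
os.symlink("old_place/report.txt", "shortcut")

# Move the file, then leave a forwarding link at the old location (method 1).
shutil.move("old_place/report.txt", "new_place/report.txt")
os.symlink("../new_place/report.txt", "old_place/report.txt")

print(open("shortcut").read())   # still prints "hello" via the forwarding link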

I am using Windows XP —Preceding unsigned comment added by 113.199.174.230 (talk) 09:06, 28 November 2009 (UTC)[reply]

If you click on a shortcut where the target has moved, Windows will automatically look for the target. 121.72.215.173 (talk) 10:12, 28 November 2009 (UTC)[reply]
Or right-click the shortcut, select "properties", and change the target in the dialog that appears. --NorwegianBlue talk 14:33, 28 November 2009 (UTC)[reply]

Can I play these Games with This computer?

Dell Studio 1555:

  • Intel Core 2 Duo P7450 2.13GHz, 1066Mhz, 3M L2 Cache
  • RAM 4GB, DDR2, 800MHz
  • 512MB ATI Mobility Radeon HD 4570
  • 500G 7200RPM SATA Hard Drive

Call of Duty World at War 1 and 2, Medieval: Total War 2, Battlefield: Modern Combat (newest one)and Grand Theft Auto 4? Thanks 99.240.194.178 (talk) 06:35, 28 November 2009 (UTC)[reply]

See Call of Duty: World at War#Development, Medieval II: Total War, and Grand Theft Auto IV#Windows version. Learn to search please kthx. 121.72.215.173 (talk) 10:11, 28 November 2009 (UTC)[reply]
It would help to know the OS (Operating System) running on your PC. A version of Windows perhaps? Cuddlyable3 (talk) 13:04, 28 November 2009 (UTC)[reply]
Here's a website that figures it out for you (with a java app). Can You Run It?. --Mark PEA (talk) 17:33, 28 November 2009 (UTC)[reply]
Short answer: yes, but on low details, except for GTA4 (that thing is even worse than Crysis). --antilivedT | C | G 06:17, 29 November 2009 (UTC)[reply]

Google mail SMS account verification

I've tried several times to create a google email account, and each time it has asked me for a phone number to "activate" the account via sms (see Gmail#Requirement_for_mobile_phone_number). I do not have a mobile phone, nor do I have any access whatsoever to mobile phones, and I cannot borrow someone elses phone because I live alone, don't work and have no friends. Is there a workaround or alternative? —Preceding unsigned comment added by 82.44.55.75 (talk) 11:11, 28 November 2009 (UTC)[reply]

I had genuinely considered offering to help earlier in the day when I first read this post (although don't live in the same country so don't know if that would work) but considering what I realised on WT:RD I've decided I definitely won't and in fact am quite glad I didn't. A word of advice, you may find in that when you make yourself a social outcast by your actions, this isn't an uncommon reaction. And I'm not surprised by your personal description. Nil Einne (talk) 06:48, 29 November 2009 (UTC)[reply]
What? You're saying you know the answer to my question but won't help because you have a personal grudge against me? That's circling very close to a personal attack. What exactly have I done to offend you? —Preceding unsigned comment added by 82.44.55.75 (talk) 10:59, 29 November 2009 (UTC)[reply]

SMS memory premature ejaculation

I love my SMS memory stick because there couldn't be an easier way to carry gigabytes in my pocket and to download them to anyone's computer. BUT recently I mangled all the files on the stick by pulling it out of the PC without clicking through the "Safely Remove Hardware" steps (in Vista). Can anyone tell me what is so difficult about making the OS write file(s) to the USB device properly which would eliminate the danger of premature ejaculation? Cuddlyable3 (talk) 13:02, 28 November 2009 (UTC)[reply]

It seems to me to be entirely possible to make a USB pen drive which preserves all of the files, except for the one it's currently writing, when you yank it out. For that file, there's no solution to only having part of it copied, but the drive should at least be able to identify it as such by setting a "copy in progress" flag, which is not cleared until it is fully copied.
Now, for some applications you may want to have multiple files open at once, but I still think only one can be written to at a time, even in that case. Perhaps the other files could have a "file open" flag set, which is less serious than the "copy in progress" flag. In many cases such a file would still be usable.
Database operations are especially complex, as many tables need to be kept in sync, and pulling the storage device out mid-update might mess that up. However, there are already software solutions in place for that, such as "transactions" which can be rolled back if incomplete.
The other solution is to make USB flash drives like CDs and DVDs, which can only be ejected when everything is OK. That feature is a major source of annoyance for many, though, as they often say "file in use" even when nothing seems like it should be in use. So, I prefer the way they work now over that. StuRat (talk) 14:49, 28 November 2009 (UTC)[reply]
USB is slow. Users will get pissed if they have to wait for writes to the USB stick. So, it goes to a buffer. To ensure the buffer is cleared before you pull the stick, you must safely remove the stick. So, the way to fix it is to remove the buffer. I know how to do it in Linux, but not in Windows. -- kainaw 16:42, 28 November 2009 (UTC)[reply]
Try right-click; Properties; Hardware; Properties; Policies; and turn off Write-Caching. --Phil Holmes (talk) 17:36, 28 November 2009 (UTC)[reply]
It is not quite trivial to ensure consistency even without caching. The problem is not the one file being written, the problem is that the structure of the file system itself is being modified, and may be in an inconsistent state. To ensure that this is consistent requires journaling or something similar, but that comes with an overhead in memory and performance. --Stephan Schulz (talk) 17:44, 28 November 2009 (UTC)[reply]
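For what it's worth, an application can shrink that window of inconsistency for its own files by flushing explicitly before the stick is pulled. A minimal Python sketch (the mount path is only an example):

import os

path = "/media/usbstick/report.txt"   # example mount point; adjust for your system
with open(path, "w") as f:
    f.write("important data\n")
    f.flush()                # push Python's own buffer down to the operating system
    os.fsync(f.fileno())     # ask the OS to push its cache out to the device

Note that this only protects that one file's data; the file-system metadata consistency mentioned above still relies on the "Safely Remove Hardware" step or a journaling file system.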

Switching from Windows (Microsoft) to Mac (Apple)

Is it worth doing (see the subject line for more info)?

Basically, my previous questions on here have got me thinking about this. At the same time, my dad made the switch a yr ago (February, 2010) and he loves it. Plus, he had his troubles with Windows as well. Somehow he gets the new one since he uses it for his business as well, and not for me. Unfortunately it's hard for him and others to justify getting any additional ones, since from what I gather Macs can be more expensive than Windows PCs. At the same time, my mom and I each have accounts on his computer, so at least I have some familiarity with them as well. Also, my mom and twin sister have iPhones and do love them. I will be getting my 1st iPhone in a wk to a mo from now and can't wait for it. Before, we always relied on Windows rather than Macs. Except our 1st computer was a Mac, but it wasn't a great experience. I have used Macs in some of my schooling yrs as well but don't really remember it.

Believe that's it for now. —Preceding unsigned comment added by Jessicaabruno (talkcontribs) 18:17, 28 November 2009 (UTC) --Jessica A Bruno 18:26, 28 November 2009 (UTC)[reply]

Unfortunately, without knowing much more about what you use your computer for, it's impossible for the Reference Desk to answer the common question of which is better. It is a very individual decision that has to do with your personal preference for the Windows interface vs. the Mac interface, your budget, and what applications you use on your computer, and which platform has a better version of these applications. Note that most iPhone users have PCs; you don't need a Mac for that. Tempshill (talk) 18:44, 28 November 2009 (UTC)[reply]

Thanx for your answer to my question here. Basically, I really just use it for emailing and the internet more than anything else.--Jessica A Bruno 22:29, 28 November 2009 (UTC) —Preceding unsigned comment added by Jessicaabruno (talkcontribs)

Why would you need a new thousand-dollar computer to send e-mails and surf the Internet? E-mail was invented in the 1960s and the Internet in the 1970s. Any computer can do those things. Most of the issues with Windows mentioned in those simplistic Apple ads have been fixed with Windows 7, by the way. Maybe you could just upgrade to Windows 7, instead? If your hardware isn't up to the task, then put some new parts in. You are correct that Macs are expensive. But it's not just the initial purchase price. Every few years, Apple changes their products, forcing you to buy a brand-new system. In 2005, they switched from the PowerPC CPU to Intel. In 2001, they switched from the classic Mac OS to a new one (Mac OS X). The same happened in the early 1990s with the PowerPC transition. You better have deep pockets.--Drknkn (talk) 22:40, 28 November 2009 (UTC)[reply]
OP should read the above as a troll. Anybody using any type of computer will want to buy a "brand-new system" a couple of times a decade or so to keep up with the latest and greatest. (Hopefully this won't be true in the future, but it certainly is true now.) Apple (or anybody else) doesn't make you upgrade to the latest systems when they come out. You just won't be able to run the fastest/fanciest stuff on an older computer, no matter what type it is. If you really just need to do email and web browsing, a 5 year old Mac or 5 year old Windows machine will probably work equally well. Staecker (talk) 23:39, 28 November 2009 (UTC)[reply]
No. Actually, anyone who is a sucker and a noob would want to buy a new computer every five years. I have parts in my computer from the 1990s. I only upgrade what I need. I can run MS-DOS programs on my Windows installation, by the way. Have fun booting into your classic mode to run your old apps. Was that worth the $1000 Apple tax?--Drknkn (talk) 01:33, 29 November 2009 (UTC)[reply]
Agreed that the above is just trolling. --Mr.98 (talk) 00:14, 29 November 2009 (UTC)[reply]
Thoughtful comment. But where did I insult anyone? You and "Staecker" seem to be the ones who are calling names. So, you're obviously both ignorant immature trolls. You'd rather make your parents pay $1000-plus dollars for a new computer than answer their basic questions? What kind of son are you?--Drknkn (talk) 01:33, 29 November 2009 (UTC)[reply]
I think that's uncalled for. The OP has had her question answered, let's just leave it here. Thanks, 99.241.68.194 (talk) 04:29, 29 November 2009 (UTC)[reply]

Thanx for your answer to my question here. Have to say your answer is interesting as well. At the same time I'm always looking for more answers to my question as well.--Jessica A Bruno 23:44, 28 November 2009 (UTC) —Preceding unsigned comment added by Jessicaabruno (talkcontribs)

I have found that people with less experience with computers do better with Macs. They are much more user-friendly than Windows systems—they "just work" 90% of the time. (That other 10% is mind-bogglingly difficult to resolve, though.) They can do almost all of the "business" functions you might want them to. There are a few programs that are Windows-only but most are cross-compatible. The hardware is more expensive, though, to be sure. On the other hand, if you consider your time valuable, I consider them to be a lot cheaper than Windows machines (to say the least of Linux machines), as you won't be worrying about viruses, spyware, or other things that plague Windows. My father switched to a Mac about two years ago and the number of "tech support" calls to me dropped to basically zero. I can't wait for my mother to switch. --Mr.98 (talk) 00:14, 29 November 2009 (UTC)[reply]
Actually, people with little Windows experience do better with Macs. Macs are fine for many beginners, but they are also very popular among computer scientists and UNIX diehards like me. There is a real UNIX under the hood, and many highly useful, if not necessarily pointy-clicky-friendly, open source applications are either pre-installed or easy to install. I'd also take some issue with claims about the price. If you buy comparable hardware, the Mac is not that expensive - as I found out in 2002, looking for a good laptop for Linux, but ending up with a TiBook, as it was cheaper than comparable models from Toshiba, IBM, and even Gateway. Apple does not cater for the low-end market, and you can get sufficiently powerful computers much below the price of Apple's entry level models. But if you compare like with like, there is no significant premium. --Stephan Schulz (talk) 01:08, 29 November 2009 (UTC)[reply]
Well, I wasn't considering the high-end of the experience scale, just because it didn't seem relevant here.

If you really just use a computer for email and the web, then the advice some way above that any computer will do is pretty much right (although wrong in other ways). However, you might have trouble getting a decade-old computer to run the latest versions of Flash and so forth if these are necessary for you.

If you're concerned about malevolent software (what are often miscalled "viruses", etc), then some alternative to Windows is likely to be a bit of a relief, because writers of this software concentrate on Windows (thanks to its huge market share). That's not to say that you can't have a safe Windows setup or a vulnerable Mac (or other) setup.

If you prefer Mac OS X then it is no longer necessary to buy a Mac for the job; you (or a friend) can install it on various other computers (but certainly not any), resulting in what's called a "hackintosh". Apple will not be happy and will not give you any help.

Another option is Linux. According to legend, this is tremendously difficult to install, get used to, and run; if I said that it came installed ("preinstalled") on the dirt-cheap computer I'm contentedly running now, I might be accused of trolling, so I shall refrain from saying it. -- Hoary (talk) 04:56, 29 November 2009 (UTC)[reply]

(Re)Installing Linux is a lot easier than installing Windows IMO (although most people are shielded from the pain by buying prebuilt computers). Windows 7 removed many driver burdens but it's still nowhere near as easy to reinstate your old settings (mount my old home partition as /home in Linux and I'm done!). --antilivedT | C | G 06:13, 29 November 2009 (UTC)[reply]
(Nobody is going to accuse you of trolling for just stating how you feel about things in a reasonable way. We'll accuse you of trolling if you pepper your posts with insults and obvious non-sequiturs meant to provoke a response.) I know that a lot of people who already know how to use Linux find it pretty straightforward, but I think they on the whole underestimate the amount of skill it takes to use when something does not go right. I personally find Linux far too opaque for the casual computer user. Even my own forays into trying to install free software on the Unix aspects of my OS X machine have been overwhelmingly frustrating, and I am pretty savvy with computers. (Oh, to install this, you must first install these other programs, all of which require hours-long compilations from the source, which requires having the right version of the compiler, and nobody feels it necessary to spell out the specific steps necessary. And then, after all that, it still won't really work correctly, but if you want it to be different, then just become a computer programmer and fix it yourself, duh.) I really wouldn't recommend it to someone who just wants to have an "easy" computing experience. --Mr.98 (talk) 15:07, 29 November 2009 (UTC)[reply]
It should be mentioned that on an average Linux desktop you never need to install anything from source. The package manager makes it possible to install and keep up to date the programs you want and all their dependencies, without wandering to a project's home site to download an installer. E.g. Ubuntu repositories have ~30,000 packages, all digitally signed, so you don't need to worry whether it's the real thing or something malicious. --194.197.235.240 (talk) 17:03, 29 November 2009 (UTC)[reply]
...and neither is it needed on the Mac. Install Fink, opt for apt-get, and nearly all of Debian is yours for a command line. --Stephan Schulz (talk) 18:12, 29 November 2009 (UTC)[reply]

Thanx for all the latest responses to my question. All of them are interesting as well.--Jessica A Bruno 19:58, 29 November 2009 (UTC) —Preceding unsigned comment added by Jessicaabruno (talkcontribs)

November 29

.emacs and OS X

When I'm using Emacs from the terminal in OS X (10.6), where does its .emacs file live? —Preceding unsigned comment added by 66.75.107.228 (talk) 02:21, 29 November 2009 (UTC)[reply]

It can be at ~/.emacs, ~/.emacs.el, or ~/.emacs.d/init.el (for the ~, see home directory). See the Info node (emacs) Find Init (which should be available locally as well as online). Of course, you may not have one; OS X surely doesn't put one there for you, and Emacs won't unless you use its Customize feature or "Options → Save Options". --Tardis (talk) 08:35, 29 November 2009 (UTC)[reply]

Monitor

Do they still make computers with CRT monitors? jc iindyysgvxc (my contributions) 04:21, 29 November 2009 (UTC)[reply]

Note that no-one manufactures computers "with" monitors. A monitor is a separate device, although computers are usually sold "with" monitors. That would be like asking "do they still make DVD players with CRT TVs?" So if you're asking "do they still make CRT monitors?" then possibly, somewhere in the world. New CRT TVs are still being manufactured and sold, so I don't see why monitors wouldn't be too. Zunaidis there when you need him — now he needs you 06:21, 29 November 2009 (UTC)[reply]
I wouldn't be surprised if there are some military designs with them from years and years ago, where a procurement order has only now been put out, after loads of testing and paperwork, for ruggedized CRT monitors, and they are just about to be manufactured and delivered for the next twenty years at a hundred times the price of an LCD monitor. But apart from ancient designs where changing the paperwork would be just too much work, I can't see any point. The space shuttle still runs on something pathetic with one megabyte of store, I believe, but it does the job and no one feels like replacing it. The Russians used to work until recently with just six kilobytes until they decided to upgrade, and that probably caused one of their rockets to crash. Dmcq (talk) 09:25, 29 November 2009 (UTC)[reply]
This is a tangent, but as noted at Space Shuttle#Flight systems, the flight control computers' goal is fail-safe reliability, so it's not that "noone feels like replacing it", but that by now the software is probably close to bug-free, and it surely cost years and several million dollars to get there. Not worth tampering with a system that's close to the apex of reliability for software just so the astronauts can play MP3s on the flight control computers. Tempshill (talk) 16:58, 29 November 2009 (UTC)[reply]

Zunaid seemingly hasn't seen the vast number of Apple iMac computers that are, to the end consumer, computers built into monitors. These have been around for maybe 10 years now - mostly in LCD format, but they start off being CRT (their eMac and their original iMac design). Many other firms do this as well - Sony Vaio have plenty like this. This is being a bit mean though, as Zunaid is right - the question is more 'are CRT monitors available new?' because most of the time the monitor itself is not part of the 'pc' per-se, it's just something sold bundled with it. The answer is - yes there are plenty of CRTs still available (at work so can't provide links but google search and you'll find 'em). 10:42, 29 November 2009 (UTC) —Preceding unsigned comment added by 194.221.133.226 (talk)

According to [5], 'Philips spokesman Joon Knapen said the CRT screens "are still made for a couple of emerging markets, but the volumes produced are very small."'. This was from about 3 days ago, but of course it refers to Philips, who are a high-end brand, and is likely primarily referring to TVs. [6] "Since April 2008, the exports of China’s CRT TVs and colorful tubes have been increased gradually compared to the same period of last year; especially the export achievement of upstream color kinescope enterprises represented by Rainbow and Beijing Matsushita, their export situation is gratifying with growth exceeded 20%. Currently, there are 20 manufacturers of CRT complete appliance in the world. 60% -70% of the CRT TVs are produced in China and over 80% of color picture tubes are manufactured in China too." Cathode ray tube#The future of CRT technology also has useful info. While these are all largely referring to TVs, I don't see any reason to think there will be CRT TVs but no CRT monitors - in other words, 194.221 is definitely right. Of course, older monitors may also be refurbished and reused, and I think this is quite common in the developing world. (In much of the developed world you can probably buy a good CRT for a very low price from auction sites and the like; e.g. I got a 19" Dell Trinitron for about NZ$12 IIRC.) Nil Einne (talk) 23:36, 29 November 2009 (UTC)[reply]

What's the problem with DeviantArt?

This has been happening for quite some time now. Some times, when I'm viewing DeviantArt, viewing a picture takes me to a phishing or malware site instead. Luckily I spot this immediately and click the "back" button before the content even starts loading. On the next try, everything works fine. This happens once every few days or so. It happens only with DeviantArt, not with any other website. (I don't even go to any shady porn or Viagra sites advertised in spam, just reputable websites.) What's the problem with DeviantArt? Has it been hijacked somehow? JIP | Talk 11:22, 29 November 2009 (UTC)[reply]

Probably their ad server is directing you to other sites and they get paid for it. —Preceding unsigned comment added by 82.44.55.75 (talk) 11:24, 29 November 2009 (UTC)[reply]
If you press ALT + V in Internet Explorer and then go to Webpage Privacy Policy, you will be shown a list of sites that are posting content on Deviant Art. You can block these sites from doing certain things (like redirecting your browser) by pressing ALT + T and clicking on Internet Options, then Security, Restricted sites, Sites, and adding the malicious site to the list. I also use this technique to block Flash ads. I experienced a malicious redirect a while ago on the New York Times's web site. These ads are usually hosted on the advertiser's site and not the site you visit. So, the site you visit has no control over what they do. The ads are often web pages (not just images), so they can run JavaScripts and other dangerous things.--Drknkn (talk) 12:00, 29 November 2009 (UTC)[reply]
I would definitely report such instances to DeviantArt immediately, giving as much information as possible.--Shantavira|feed me 16:16, 29 November 2009 (UTC)[reply]
If the redirects are coming from the ads, then that's a bit difficult. I don't pay attention to ads, so I don't know which ads are displayed on the page unless I am specifically trying to find out. And when I come back from a malicious redirect to the real DeviantArt site, the ad should already have been replaced by a legitimate ad, shouldn't it? JIP | Talk 16:42, 29 November 2009 (UTC)[reply]
You could disable JavaScript temporarily. In IE, you'd press ALT + T, then go to Internet Options --> Security --> Custom Level and scroll down to Scripting and disable "active scripting." Then, view the privacy report and block all sites except Deviant Art. Reload the page and repeat. Then, re-enable JavaScript. While you're in the security box, also disable "META REFRESH" under Miscellaneous. Most of the time, redirects are done via either JavaScript or a meta refresh.--Drknkn (talk) 20:44, 29 November 2009 (UTC)[reply]

Any way to password protect an entire external hard drive?

It's just plug and play and not secure. Is there any way to password protect the entire drive conveniently? If you password protect a single folder containing all data on the drive, do you have to enter that password every time you add or change data on the volume? (Using Windows Vista)... --Damriteido (talk) 16:00, 29 November 2009 (UTC)[reply]

You can encrypt an entire drive using TrueCrypt, although it's probably better to just create a TrueCrypt volume on the drive, as that allows the drive to still be used for other things. Or you could store the files you want protected in a .zip file and use its password-protection function. As far as I know, most encryption programs will remember your session and won't ask for the password for each data change, until you exit the program or dismount the volume. —Preceding unsigned comment added by .isika (talkcontribs) 16:08, 29 November 2009 (UTC)[reply]
I agree with .isika. BitLocker Drive Encryption and TrueCrypt are probably the articles you'll want to refer to. Tempshill (talk) 16:53, 29 November 2009 (UTC)[reply]

Other use of "Interstitial"

Hi, I've checked Interstitial webpage and searched the archive for keyword "interstitial", but I can't find what I'm looking for - a system I've seen years ago, of which I can't remember the name. It was a proxy/whitelist system that would offer an interstitial page when the user hit a non-whitelisted page, instead of an "access denied" error page. The interstitial told the user that proceeding to that page *will* be logged, and that if s/he intends to access it, it might be wise to fill out the comment box below (say, you're in a bank's trading department, and CNN reports that sex dot com is going to be bought by foo dot com - that would be a legitimate reason for a short visit to sex dot com, which would usually be blocked by default). Optionally, it could be configured not to let the user access the page until an administrator had reviewed the site and the comment - in that case, the URL and the comment were forwarded to a mailbox set up for that purpose. I don't even remember if that was some sort of plugin for Squid, or if it was a completely different piece of software altogether. :-( Does this sound familiar to someone? -- 78.43.93.25 (talk) 17:42, 29 November 2009 (UTC)[reply]

How to compile GNU stuff

Hello! How do I take the sources that I download from the GNU project and compile them so they'll run on my Windows Vista 64-bit computer? A lot of the Windows binaries I find only work on 32-bit machines, and I know nothing about compiling to be able to create a 64-bit executable myself. Thank you!--el Aprel (facta-facienda) 20:11, 29 November 2009 (UTC)[reply]

MinGW has a 64 bit version (although I don't know how mature it is). That'll give you a basic gcc-win64 toolchain which should be enough to get other stuff building. -- Finlay McWalterTalk 20:21, 29 November 2009 (UTC)[reply]
Okay, MinGW-64 has given me some command-line executable compilers like g++, c++, and gcc, but I'm not sure what files I'm supposed to compile from open sources. I've downloaded the sources for MPlayer, for example, which contain a bunch of .c and .h files, but even the README isn't that clear on what to compile. Do I try to compile the whole directory? Is there a certain "main" file I should look for, pass it into the compiler, and that will take care of it?--el Aprel (facta-facienda) 21:35, 29 November 2009 (UTC)[reply]
For clarity, there's nothing stopping you running 32-bit Windows apps, including MPlayer and other MinGW-compiled apps, on any version of Windows x64, as all x64 versions of Windows - including Windows XP x64, Windows Server 2003 x64, Windows Vista x64, Windows Server 2008 x64, Windows 7 x64, Windows Server 2008 R2 x64 (although it's an optional component there) and almost definitely Windows 8 (which is likely to be only available in x64 versions) - include WoW64. While a native x64 build will be able to use more than 4GB and in some cases may be faster, this is not guaranteed, particularly in cases like those I describe later (IA64 versions of Windows are obviously a different case).
For a few things, like drivers and perhaps other software which requires very low-level or complete interaction with the system, you do need native x64 versions, but that's obviously not the case here. I've been using Windows x64 since the XP days (late 2005 I think) and, excluding drivers and things tied to them (including hardware monitoring or overclocking stuff, firewalls, CD mounting utilities and Acrobat) and on-access antivirus software, none of which is really a problem nowadays (except for the requirement for signed drivers on Vista x64 and Windows 7 x64, which can be turned off if really necessary), the only apps I've ever had problems with have been ones which refuse to install (largely games) - which can also happen on Vista x32 - and two or so games (cheapish adventure games, so not something you encounter a lot) which didn't work properly on Vista x64; I never worked out whether that was because they didn't like Vista or didn't like x64.
Oh, and I also know of a friend who had problems with SolidWorks (2008 Student Edition) because of a variety of issues which I can't remember; I think they included strange stuff like the x64 version not running on Vista x64 but only XP x64, and the x32 version not running on any x64 OSes, or perhaps not without a patch. I don't quite know the cause, but it may be because of their activation/licensing modules, or simply because they don't support the platform and have therefore decided to refuse to let you run it in case it doesn't work. Even this is mostly fixed in the 2009 version I think, and the 2008 version has patches, but the Student Editions can be behind the normal versions and they don't appear to support the Student Editions with patches.
Definitely I've never had any problems with any GNU apps I've tried, including several of those in Cygwin, nor with MPlayer (which isn't a GNU app AFAIK, even if it's released under the GPL) or VLC. AFAIK, things still aren't that good on the free-software Windows x64 compiler front, which I believe is part of the reason for the lack of Windows x64 builds for most FLOSS software. You may be able to get it to work, but unless you know what you're doing I wouldn't recommend it.
You can get unofficial Windows x64 versions of some codecs and apps (e.g. Firefox), but you should bear in mind that, for example, for browsers you need x64 plugins, which on Windows are still somewhat unavailable (a Java plugin finally became available late last year; Flash is still not available AFAIK), and for DirectShow codecs you need an x64 media player or app to use them. You may not gain much of an advantage in terms of speed either, because many have a lot of hand-coded low-level optimisations which will need to be reimplemented before you really gain anything.
If you plan to regularly encode a lot of stuff and believe you will gain a speed advantage with x64 versions (hint: look for benchmarks; or, even if you want to use MPlayer, try ffdshow, x264 or other DirectShow codecs and/or apps for which you can find already-compiled x64 versions and use these to test - they are by and large the same codecs MPlayer uses internally), then it's obviously worth it. Or if you have a lot of memory (over 4GB, presuming the app is large-address-aware; if it isn't, it's probably easier to either ask someone or compile an x32 large-address-aware version yourself rather than fool around with an x64 version) and want to use it with the app, ditto. And obviously if the app does have an official Windows x64 version then you probably should use it. But otherwise I somewhat doubt it's worth worrying about. (In the older days, I used to seek out all the unofficial x64 versions of apps I could find, but I've somewhat given up on that.) In particular, if you only want to use MPlayer as a player, it's unlikely to be worth it.
Obviously it's up to you what you do, and if you still want to try it I won't try to further discourage you; I'm just suggesting you think carefully about whether it's really worth your time, since it doesn't sound like you know what you're doing (this is not intended in a negative way - I don't either), and from my experience, compiling open source apps on Windows, particularly those not written natively for Windows, can be tricky: you often have to manually seek out the libraries, and then may have to do various things to get it to work, or it may even just not work if no one regularly compiles it on Windows - and I'm guessing trying to do it for x64 is going to be even more annoying.
Nil Einne (talk) 23:04, 29 November 2009 (UTC)[reply]
MPlayer isn't GNU software (which is important, as GNU software tends to all be built the same way, but other projects can be more variable). In general you run the configure script, which chunters away for a bit and then emits a makefile, and you then typically run make and then make install. That all works because of autoconf (which makes the configure script, but which I think you don't need to run the build); autoconf tries to build a platform-neutral build system. But MinGW (or Cygwin, for that matter) aren't magical compatibility layers that allow any Unix(ish) software to be built on Windows and run. They don't do anything for UI stuff, media stuff, and lots of other things that aren't within the (rather narrow) ambit of POSIX (and a few other things) that MinGW and Cygwin implement on Windows. For stuff like that, I'd check the site for the specific project; if they don't have a build for that platform, they probably don't have an official one. In that event I'd ask on the project's mailing list. And it's not at all a safe assumption to think that a C or C++ program will run faster if compiled to 64 bit rather than 32: the significant expansion of code, data, and stack size will (roughly) halve the efficiency of the processor's cache; 64-bit applications really only make sense if important parts of the code have been written to properly take advantage of it. -- Finlay McWalterTalk 23:34, 29 November 2009 (UTC)[reply]

November 30

How the algorithm to fade from one image to another is implemented

Is it usually done by

1)For each pass in a loop depending on how quickly the fade is to be, completely replace every nth pixel of image1 with the corresponding pixel of image2 and decrease n so slightly more pixels will be replaced the next time around until all pixels have been replaced.

2)For each pass in a loop depending on how quickly the fade is to be, modify every single pixel value according to a calculated shift so every image1 pixel value transforms into its corresponding pixel2 value with many values belonging to neither image occurring during the transformation.

3)Something else I didn't think of.

If it's either of the two I could think of, I'm betting it's #2 since #1 would probably appear spotty even if the resolution was sharp, but it (#2) sure seems like a lot of calculation looping through every pixel. Then again, I know computers are just very fast. 71.161.61.41 (talk) 00:10, 30 November 2009 (UTC)[reply]

Normally, it is done by completely overlaying one image on top of another image. Then, through a loop, change the opacity of the top image until it is invisible and only the bottom image is visible. The end result is that the video output will average each pixel of both images together based on opacity of each image - but the user programming the fade isn't doing that. The person who programmed the video driver (or similar) does that part. -- kainaw 00:18, 30 November 2009 (UTC)[reply]
Yes, #2. What you're doing is generally linear interpolation (although there might be circumstances where you'd flatten the two ends of the line horizontally a bit, to give the effect a "soft landing"). Yes, it's a lot of bit shovelling, but you're right, modern CPUs are way fast. This process is generally called alpha blending (with you changing the alpha value to make the fade effect). Some very resource-constrained platforms do perform alpha blending using the #1 method you describe - Quake 2 did this (if you didn't have a decent video card) and called it "stipple alpha". -- Finlay McWalterTalk 00:21, 30 November 2009 (UTC)[reply]
It's irrelevant how fast your CPU is - blending is best handled with shader code in the GPU. Linear interpolation happens at ungodly speeds in the GPU. In order to get a nice-looking cross-fade, you generally want to ramp the alpha of one image against the other with a smoothed-off ramp - not a straight linear ramp. I generally use a sine-wave shape to vary the alpha over time. What that does is (a) look smoother because there is no abrupt onset and ending of the fading and (b) it gets over the 'confusing' part where the two images are roughly equally represented in less time...it just looks way better. SteveBaker (talk) 02:31, 30 November 2009 (UTC)[reply]
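As a concrete illustration of option #2, here is a minimal Python/NumPy sketch of a per-pixel cross-fade using the sine-shaped alpha ramp described above (file names and frame count are made up; a real-time implementation would normally do this on the GPU, as noted):

import numpy as np
from PIL import Image

a = np.asarray(Image.open("image1.jpg"), dtype=np.float32)
b = np.asarray(Image.open("image2.jpg"), dtype=np.float32)   # must be the same size

frames = 30
for i in range(frames + 1):
    t = i / frames                        # progress from 0.0 to 1.0
    alpha = (1 - np.cos(np.pi * t)) / 2   # sine-shaped ramp: gentle start and finish
    blended = (1 - alpha) * a + alpha * b
    Image.fromarray(blended.astype(np.uint8)).save("fade_%03d.jpg" % i)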

Refilling an HP printer ink cartridge

I am very experienced in refilling black ink cartridges. All the instructions I've seen for refilling an HP45 cartridge say that you should make a hole underneath the cartridge, where it looks as if the original factory filling hole was. Is there any good reason not to ignore this and make the hole above the cartridge, where the ink is less likely to leak out and filling is so much easier? The Lexmark printer I previously used had cartridges where it was easy to remove the loose top and pour ink onto the sponge inside, but does the HP45 have - I speculate - some sort of siphon arrangement that requires an air tight seal? Without destroying a cartridge, does anyone know what the HP45 would look like if cut in half? 78.147.183.186 (talk) 00:55, 30 November 2009 (UTC)[reply]

One reason to fill that cartridge from the bottom is that you already have a hole there, rather than needing to drill one. Of course, leaking is possible in that location, but maybe that's why they put the hole on the bottom, to discourage home ink refills. StuRat (talk) 03:09, 30 November 2009 (UTC)[reply]

Chatzilla DCC automation

I use Firefox with Chatzilla 0.9.85. On some channels, public DCC file servers accept requests with the syntax

!<server username> <filename>

Can I set up a script so that if I type a request with this syntax in Chatzilla, it will automatically accept any offer from that user to send a file with that name? Also, can I set one up so that if certain usernames (who are search bots) offer .txt.zip files, they will automatically be accepted and unzipped and their contents displayed in the chat tab? NeonMerlin 02:08, 30 November 2009 (UTC)[reply]

Uniloc Patent

Does the Uniloc patent cover product activation in general, or product activation with a key? Either way, both seem very broad to me and cover basic things that don't seem patentable, although I can see a specific method of product activation using a key, with a couple of different protocol steps, as patentable. --Melab±1 02:10, 30 November 2009 (UTC)[reply]

You can read the patent here. SteveBaker (talk) 02:16, 30 November 2009 (UTC)[reply]

Fractional bit of entropy from /dev/random

In Linux, is it possible to get less than one bit of entropy at a time from /dev/random, in the form of a bit whose probability of being 1 is something other than 0.5? (For example, one could then simulate a six-sided die using a first bit with probability 1/3 of being 1, followed by one or two bits with probability 1/2 of being 1, and then interpret 000 as 1, 001 as 2, 010 as 3, 011 as 4, 10 as 5 and 11 as 6.) NeonMerlin 02:54, 30 November 2009 (UTC)[reply]

I do not use Linux, but couldn't you just use /dev/random three times to get three bits, giving you eight different numbers with equal probability? If the number obtained was seven or eight, then redo the procedure. I recall it is possible to get a Gaussian distribution, for example, by adding (I think) several instances of the more usual 0-to-1 uniform variable, due to the central limit theorem, and you should be able to do this with /dev/random as well. For example, if you added /dev/random 100 times you would get a mean of 50 - I cannot remember the formula for the standard deviation. 78.146.171.75 (talk) 11:18, 30 November 2009 (UTC)[reply]
Disclaimer: I haven't the faintest idea how /dev/random works. However, I was under the impression that it behaves like a file. If so, reading individual bits would be impractical anyway. As 78.x.x.x indicates, a common algorithm is to use a modulo operation to obtain a number in the desired range, discarding outputs from the random number generator that would result in an uneven distribution. decltype (talk) 11:35, 30 November 2009 (UTC)[reply]
You cannot read "less than one bit" from /dev/random. Obviously, a bit is as small as it gets. Further, you will have difficulty reading less than a byte (8 bits). So, your best bet is to read one byte and mod by 6. That will randomly give you 0-5. Add one to the result to get 1-6. If your problem is that /dev/random is blocking, use /dev/urandom. It is slightly (very slightly) less random, but won't block. -- kainaw 13:14, 30 November 2009 (UTC)[reply]
Yes, but as noted above, it is important that you then discard outputs that result in an uneven distribution (252-255 inclusive). Otherwise your die will have a lower probability of getting a 5 or 6. decltype (talk) 13:24, 30 November 2009 (UTC)[reply]
The idea is to avoid wasting more entropy than is needed to generate the number, in cases where entropy is in short supply. The above method, if it's to be unbiased, will spend slightly more than 8 bits of entropy on average (since 252-255 will have to be rerolled), while a 6-sided die theoretically requires (and, on average, the method I suggested uses) only 2.585 bits. NeonMerlin 13:30, 30 November 2009 (UTC)[reply]
In that case, I would suggest generating multiple dice rolls from a single read of random bits and buffering the results. To generate five rolls with regular six-sided dice, you can get away with thirteen bits, which is close to the optimal 12.925. If you can find a power of 6 that is closer to a power of two, you could do even better, but that would require a large number of bits and a lot of computation. Even generating three rolls from a byte would be an improvement (generate a number 0-215 and discard the rest). decltype (talk) 14:04, 30 November 2009 (UTC)[reply]
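To make the byte-and-reject idea concrete, here is a small C sketch along the lines decltype describes. It reads single bytes from /dev/urandom; since 6^3 = 216, one accepted byte in the range 0-215 yields three unbiased rolls, and bytes 216-255 are simply thrown away. The helper name is illustrative only.

#include <stdio.h>
#include <stdlib.h>

/* Read one unsigned byte from an already-open stream on /dev/urandom. */
static int random_byte(FILE *f)
{
    int c = fgetc(f);
    if (c == EOF) {
        perror("reading /dev/urandom");
        exit(EXIT_FAILURE);
    }
    return c;
}

int main(void)
{
    FILE *f = fopen("/dev/urandom", "rb");
    if (!f) { perror("/dev/urandom"); return EXIT_FAILURE; }

    /* Any byte in 0..215 encodes three independent, unbiased die rolls
       (its three base-6 digits); larger bytes would bias the result,
       so reject them and read again. */
    int b;
    do {
        b = random_byte(f);
    } while (b >= 216);

    int roll1 = b % 6 + 1; b /= 6;
    int roll2 = b % 6 + 1; b /= 6;
    int roll3 = b % 6 + 1;
    printf("%d %d %d\n", roll1, roll2, roll3);

    fclose(f);
    return 0;
}

The same pattern works for the thirteen-bit, five-roll variant; only the read size and the rejection threshold (6^5 = 7776 out of 8192) change.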

Regarding SGML

I am working on an SGML file using the EPIC editor software. Recently I have faced a problem while trying to print the document. It shows two error messages: 1. some unrecognized characters were present, and 2. invalid revision bar point.

I have not been able to rectify these errors; anyone who has a solution is welcome to share it.


Thanks and regards

DINESH KUMAR B —Preceding unsigned comment added by Dineshbkumar06 (talkcontribs) 04:14, 30 November 2009 (UTC)[reply]

Where Do The Aquamacs .el Files Live?

Hi, I’m just getting started with Aquamacs on OS X 10.6 and I’m not sure where I’m supposed to add customizations. For example, I’m trying to install a markdown mode (http://jblevins.org/projects/markdown-mode/) but I don’t know where to put the actual .el file. Also, I’m just putting my .emacs files in my home directory. Is that the best place for it? —Preceding unsigned comment added by 72.234.155.128 (talk) 04:31, 30 November 2009 (UTC)[reply]

PHP - database to form

For existing records in a database that are input via form, to allow an authorized user to edit those records, could I:

1. make an edit button run a query on that record's full database entry and return all current values
2. slide those values (& nulls or no datas) into the original data entry form as the default values
3. have this edit form trigger an UPDATE SET query

and be done with it? Is it really that simple? I'm wondering about records that have database values which have never been entered - so not nulls, just no data - how would they return in the query array, just empty fields? Does anyone see a problem with this? It seems a little too simple...218.25.32.210 (talk) 07:47, 30 November 2009 (UTC)[reply]

I was unable to understand your question. Could you perhaps include an example to make it clearer? --Sean 14:25, 30 November 2009 (UTC)[reply]
If I remember correctly, different data types (different INPUT types, really) behave differently when they are blank. If it is a text field, you get an empty string in your return array. But if it is, say, a checkbox, you get nothing at all in the query array when it is submitted empty, which means you then have to check for its absence to know it was left blank. (That is, you don't get a variable with nothing in it; you get no variable. So you only see the variable when the box has been positively checked.)
But other than that -- it's pretty straightforward, yes. You just have to make sure that you are careful about how the different datatypes return, and of course all of the different INPUT types have different ways of indicating default values (irritatingly enough). --Mr.98 (talk) 14:41, 30 November 2009 (UTC)[reply]

C# abstract classes

In C#, is it possible to specify that a given nested class A inside an abstract class B is abstract, but that in non-abstract subclasses it must not be? Also, is it possible to specify that where said B.A implements IEnumerator<B>, the C.A inside any given subclass C must implement IEnumerator<C>? NeonMerlin 07:56, 30 November 2009 (UTC)[reply]

Use phone as bluetooth repeater from laptop to headphones?

Can I configure my Samsung VICE R561 phone to serve as a repeater for music my laptop sends to my headphones over Bluetooth A2DP? The VICE itself supports A2DP, its application platform is J2ME, and I'd be willing to install third-party firmware if necessary as long as I can still use the phone functions that I consider to be core (talk, text, WAP, alarm clock, appointment calendar, camera, phone's own music player). NeonMerlin 10:16, 30 November 2009 (UTC)[reply]

Is it good programming practice to use state variables?

I am only an amateur programmer. Currently I am writing a subroutine that checks that the data in a text file is in the right format when loaded into the program. For example I want to ensure that the command word "start" is followed by the command word "finish" exactly five lines later, and if not give an error message. Is it best practice just to have a variable that goes to a value of 1 when "start" is found, and then counts up every file line until it gets to 6 when "finish" should appear, or is there some better way to do it? There are other state variables as well. I'm thinking that state variables may be difficult to follow if I want to revise the program in the future, and perhaps lead to spaghetti code. Thanks. 78.146.171.75 (talk) 14:38, 30 November 2009 (UTC)[reply]

It's difficult (and often too subjective) to be so definitive about what is good (or, perhaps, bad) practice, particularly in such an abstract circumstance. In some circumstances it may be best to have state implicit in the program structure (e.g. we can't get to this line unless we've already seen the start condition). In other cases it might be better to explicitly have a state machine. In general, when it comes to variables, states, and function names, I have a (rather Whorfian) theory - if you can give things a sensible name (e.g. STATE_AWAIT_END_TOKEN, wait_for_next_line(), bPacketHeaderReceived) then you still understand your program, but when you define some thing that owes its existence to the exigencies of your specific implementation, where that variable means this-is-true-but-not-that-unless-something-else (where you can't give that condition a plain English definition), then your understanding of what you're coding may be slipping. -- Finlay McWalterTalk 15:06, 30 November 2009 (UTC)[reply]
I would say that state variables are a good concept. The implementation of "state" can range from great to terrible. Compare, for example, Entity Java Beans, which are basically persistent state variables, with global variables in C or C++. In the first case, the persistent state is well managed by the programming language syntax and the runtime environment, and a lot of the abstractions are properly and completely handled. In the second case, global variables in C++ tend to result in a leaky abstraction because they don't enforce certain requirements (like thread consistency, concurrent access, locking, validity checking, scope, etc.). So, while a state variable is probably a good thing in general, its implementation determines whether it is a good design choice for your particular problem. Nimur (talk) 15:41, 30 November 2009 (UTC)[reply]
It's good that you're thinking about these issues, but I wouldn't sweat the style/understandability of anything that can fit in just a few lines of code like what you're describing. Lexically scoped variables in short functions are usually obvious even without decent names. --Sean 18:06, 30 November 2009 (UTC)[reply]
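As a concrete illustration of the single-counter approach the questioner describes (and of Finlay's point about giving state a readable meaning), here is a short C sketch. It assumes the file is plain text with "start" and "finish" on lines of their own; everything else about the format is invented for the example.

#include <stdio.h>
#include <string.h>

/* Check that every "start" line is followed by a "finish" line exactly
   five lines later. One counter doubles as the state variable:
   -1 means "not inside a start block", 0..4 counts lines since "start". */
static int check_format(FILE *f)
{
    char line[256];
    int lines_since_start = -1;   /* the state variable */
    int lineno = 0;

    while (fgets(line, sizeof line, f)) {
        lineno++;
        line[strcspn(line, "\r\n")] = '\0';   /* strip the newline */

        if (lines_since_start >= 0) {
            lines_since_start++;
            if (lines_since_start == 5) {
                if (strcmp(line, "finish") != 0) {
                    fprintf(stderr, "line %d: expected \"finish\"\n", lineno);
                    return 0;
                }
                lines_since_start = -1;       /* back to the idle state */
                continue;
            }
        }
        if (strcmp(line, "start") == 0)
            lines_since_start = 0;
    }
    return 1;   /* format looks OK */
}

Whether you keep it as one counter like this or spell it out as an explicit state machine with named states, the important thing is that you can still say in plain English what each variable means.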

Thanks. I'm curious, would it be possible to write a program entirely as a giant state table, apart from where the program interfaces with other things? 89.242.99.245 (talk) 19:54, 30 November 2009 (UTC)[reply]

System admin blocks access to C: drive

My system administrators at work have blocked access to the C: drive on my office computer. I can only save files on the network. As a result nothing is saved (cookies, passwords, MS Office customizations), it all disappears overnight and has to be laboriously reinput each morning. As you might imagine, this is incredibly annoying – and also bad for productivity. I've raised the issue with the IT department but to no avail. Is there any way for me to unblock this access? Thanks. --Bluegrouper (talk) 16:02, 30 November 2009 (UTC)[reply]

If you actually value your job I really wouldn't recommend trying to bypass the access restrictions. They've been put in place specifically and, depending on your contract, trying to get past them could be seen as a breach of contract. It sounds very much like they don't want people customising the computers, so in reality you're better off not changing them back every day (which is where the loss of productivity comes from); if there is something specific that would help you out, you're best off asking for that change rather than asking for complete access. Incidentally, regarding only being able to save files on the network: this is probably for backup purposes, as the individual machines probably aren't backed up but the network fileserver is. ZX81 talk 16:32, 30 November 2009 (UTC)[reply]

Google Earth

I lay out a path on Google Earth, do the whole mileage thing, and it all works great. But when I go to close out, I really need a way to save this (my plot is over 600 miles); I'm trying to plan out a major trip and need Google Earth to pull through for me. Thanks! Hubydane (talk) 16:42, 30 November 2009 (UTC)[reply]

You could try using some different software. Map24 is pretty good for this sort of thing. --Richardrj talk email 16:53, 30 November 2009 (UTC)[reply]
When you have your directions in the "Directions" window, just drag the direction name (e.g. "New York, NY to Niagara Falls, NY") with a little blue globe on it into the "Places" window, and it will save it there. Next time you want to view it, just click on it in the "Places" window. --Mr.98 (talk) 17:30, 30 November 2009 (UTC)[reply]

My apologies; I'm not actually doing a road trip. I'm kayaking a river, so I can't exactly do from point A to point B and get directions. —Preceding unsigned comment added by Hubydane (talkcontribs) 18:29, 30 November 2009 (UTC)[reply]

C question

I have a structure like that given below:

struct a
{
 int **aP;
 int size;
};

int main(void)
{
 struct a* obj;
 return 0;
}

How do I access aP (and set its values) inside main, using variable obj? Please help. I've tried everything. Out of luck today.--117.196.129.72 (talk) 16:50, 30 November 2009 (UTC)[reply]

(*obj->aP)[i]
you'll have to allocate space for the structure and the array first of course. Does aP really need to be a pointer to a pointer?—eric 17:20, 30 November 2009 (UTC)[reply]
Or obj->aP[j][i] or *obj->aP[j] depending on what you're trying to do. What is aP? -- BenRG (talk) 17:45, 30 November 2009 (UTC)[reply]
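Putting eric's and BenRG's replies together, here is one complete, minimal example, assuming aP is meant to be a dynamically allocated two-dimensional array (the 3 x 4 shape is just for illustration):

#include <stdio.h>
#include <stdlib.h>

struct a
{
    int **aP;
    int size;
};

int main(void)
{
    /* Allocate the struct itself... */
    struct a *obj = malloc(sizeof *obj);
    if (!obj) return EXIT_FAILURE;

    /* ...then the array of row pointers... */
    obj->size = 3;
    obj->aP = malloc(obj->size * sizeof *obj->aP);
    if (!obj->aP) return EXIT_FAILURE;

    /* ...then each row, setting some values. */
    for (int j = 0; j < obj->size; j++) {
        obj->aP[j] = malloc(4 * sizeof *obj->aP[j]);
        if (!obj->aP[j]) return EXIT_FAILURE;
        for (int i = 0; i < 4; i++)
            obj->aP[j][i] = j * 4 + i;
    }

    printf("%d\n", obj->aP[2][1]);   /* reads back 9 */

    /* Free in reverse order of allocation. */
    for (int j = 0; j < obj->size; j++)
        free(obj->aP[j]);
    free(obj->aP);
    free(obj);
    return 0;
}

If aP only ever points to a single one-dimensional array, a plain int * would be simpler, as eric hints.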

Only using 50%

On Windows 7, some programs only ever use 50% of the CPU at most, even when they clearly need more; Windows is apparently limiting them to 50%. How can I let them use the full 100%? —Preceding unsigned comment added by 82.44.55.75 (talk) 17:33, 30 November 2009 (UTC)[reply]

Most likely you have a dual-core CPU and the program is using 100% of one core and 0% of the other. Some programs may have a "number of threads" or "number of cores" option that you can set to 2 instead of 1. Otherwise, there's nothing you (or Windows) can do to make the program use the second core. -- BenRG (talk) 17:42, 30 November 2009 (UTC)[reply]
To elaborate a little further: whether a program can use multiple cores depends on how it is programmed. Many programs these days can seamlessly use multiple cores (games are pretty good at this, for example), but many cannot. Here is an article from a couple of years back that discusses how some programs can make good use of multi-core processors while others cannot. It is primarily about going from 2 to 4 cores, but the basic concept is the same. Programming for multiple cores is not trivial, depending on the type of program: some are really amenable to having multiple threads working in parallel, but some are not. --Mr.98 (talk) 18:23, 30 November 2009 (UTC)[reply]
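To make the difference concrete, here is a tiny POSIX-threads sketch in C (the Windows equivalent would use the operating system's native threading API rather than pthreads). With one worker thread a program like this can only ever saturate one core; with two workers a dual-core machine can reach roughly 100% instead of 50%. The loop bound is arbitrary and is only there to burn CPU time.

#include <pthread.h>
#include <stdio.h>

/* Each worker just spins, consuming as much CPU as the scheduler allows. */
static void *worker(void *arg)
{
    (void)arg;
    volatile unsigned long counter = 0;
    for (unsigned long i = 0; i < 4000000000UL; i++)
        counter++;
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;

    /* Two threads: on a dual-core CPU the scheduler can run one on each core. */
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    puts("done");
    return 0;
}

Whether splitting real work across threads like this actually helps depends entirely on whether the program's task can be divided up, which is exactly why some applications scale to extra cores and others don't.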

Another question regarding this

I (not the original OP!) have another question about this: I know that if you write an app that will consume as much CPU time as possible (e.g. by entering an infinite loop), it will only use 50 % of my CPU power (my CPU has two cores). However, when I read the indicators on the CPU usage Vista sidebar gadget I use, it says that Core #1 is used to 50 %, and Core #2 to 50 % as well. Shouldn't it be 100 % and 0 %? --Andreas Rejbrand (talk) 18:54, 30 November 2009 (UTC)[reply]

I'd question the accuracy of the sidebar gadget, because I'd also expect it to be 100% and 0% if it's only using one thread. It could be that the gadget is just taking the total load and assuming it's shared across two processors, although that's a pretty bad assumption if so. The way to really check would be to look at Task Manager's "Performance" tab, which has a graph for each processor. Run your loop and see if only one of the processors greatly increases (I just tried a simple "do loop" in VB6 and, on my quad core, only one processor increased, as expected). ZX81 talk 19:10, 30 November 2009 (UTC)[reply]
It appears that the gadget is correct. See cores.png @ privat.rejbrand.se for a screenshot. I know that AlgoSim only uses one thread for the factorization (because I wrote AlgoSim), and the CPU is doing almost nothing besides the factorization. Both cores were almost at zero before the factorization began. --Andreas Rejbrand (talk) 19:15, 30 November 2009 (UTC)[reply]
It's also possible that it is rebalancing the load between cores frequently enough that the split ends up being roughly 50/50. I'm more inclined to suspect a poorly written gadget though. There would be little to no benefit in rebalancing the task across cores, and you'd throw away cache every time you did so. —ShadowRanger (talk|stalk) 19:17, 30 November 2009 (UTC)[reply]
Posted that without noticing the response you made. Looks like my speculation was correct though. —ShadowRanger (talk|stalk) 19:18, 30 November 2009 (UTC)[reply]