Wikipedia:Reference desk/Computing

From Wikipedia, the free encyclopedia

The Wikipedia Reference Desk covering the topic of computing.

Welcome to the computing reference desk.
Want a faster answer?

Main page: Help searching Wikipedia

How can I get my question answered?

  • Provide a short header that gives the general topic of the question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Post your question to only one desk.
  • Don't post personal contact information – it will be removed. All answers will be provided here.
  • Specific questions that are likely to produce reliable sources will tend to get clearer answers.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.

How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.

November 7

Intel Xeon versus i7

Comparing a single core, how would the performance of a 3.4GHz Intel Sandy Bridge i7 compare to a 2.4GHz Xeon E5-2695v2? This would be on integer operations only, often with a lot of reading and writing to memory. The i7 has a higher clock speed but the Xeon is a newer design with more memory cache. Bubba73 You talkin' to me? 05:29, 7 November 2018 (UTC)

This can't really be answered without more information. Critical information would be: (1) the exact ratio between arithmetic operations (internal to the CPU) and memory accesses, (2) whether both systems have the same L2 cache size and design, (3) how the size of the working set of the program compares to that of the L1 and L2 caches, (4) if the working set is larger than the L1 cache, some analysis of the program's locality of reference would be needed to understand how much benefit the caches provide. This type of analysis is in most cases fairly difficult to do analytically; it's much easier to just test the program on both architectures. CodeTalker (talk) 16:23, 7 November 2018 (UTC)
I can't really directly compare the two because I have some Sandy Bridge i7s but I'm thinking about buying a Xeon workstation. The mix of integer operations and memory operations will vary from program to program, but the memory-intensive stuff mainly accesses the memory in sequence. The i7s have the standard caches. The workstation I'm considering has two 12-core Xeon E5-2695v2 running at 2.4GHz, 64GB of ECC RAM, but I don't know about the memory cache sizes. Bubba73 You talkin' to me? 02:16, 8 November 2018 (UTC)
Depends what you are doing, but I've generally found PassMark to be reasonably predictive. (talk) 20:49, 10 November 2018 (UTC)
Thanks - is that PassMark score running on one core or all of the CPU's cores? It gives 8,196 for the Sandy Bridge i7-2600 and 15,847 for the Xeon E5-2695 v2. That is 1.93x for the Xeon, but it has 12 cores compared to 4. If that is per core, that is great for the Xeon, but if it is for the whole CPU, that isn't so great for the Xeon (which has 3x as many cores). That 1.93x ratio is pretty close to 2.4*3/3.4 = 2.12, which would indicate that a Xeon core doesn't quite match an i7 core, per GHz, and the Xeon runs at a much lower clock speed. Bubba73 You talkin' to me? 04:06, 14 November 2018 (UTC)
Well, I found this, which says that it runs on all available CPUs. Which means that the 12-core Xeon E5-2695 v2 is a little less than the equivalent of 8 Sandy Bridge i7 cores. Bubba73 You talkin' to me? 04:12, 14 November 2018 (UTC)
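The per-core arithmetic in the comments above can be checked directly. A quick sketch, using only the whole-CPU PassMark scores quoted in this thread and integer shell arithmetic:

```shell
# Whole-CPU PassMark scores quoted above, divided by core count,
# to get a rough per-core figure for each chip.
i7_per_core=$(( 8196 / 4 ))      # Sandy Bridge i7-2600: 4 cores
xeon_per_core=$(( 15847 / 12 ))  # Xeon E5-2695 v2: 12 cores
echo "i7 per core:   $i7_per_core"    # 2049
echo "Xeon per core: $xeon_per_core"  # 1320
```

So one E5-2695 v2 core scores roughly 1320/2049 ≈ 0.64 of an i7-2600 core on this benchmark, which is consistent with the observation above that the twelve-core Xeon lands a little under eight i7 cores' worth of total throughput.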

November 8

Cropping images by percentage with ImageMagick
I'm trying to use ImageMagick to crop a bunch of images of various sizes. I want to remove the leftmost 10% and the rightmost 10% of each image. Let's say the image is 1000 width by 1000 height, I want to remove the leftmost 100x1000 rectangle, and the rightmost 100x1000, so that I end up with a resulting image of 800 width by 1000 height. Reading its manual[1], this seems easy to do using the -crop command if all the image sizes were identical, so that I could pre-calculate the crop rectangle XY coordinates in advance. But when the input images have different dimensions, I can't seem to find a way to use percentage values to specify the crop rectangle size and location. Any and all help would be much appreciated. Mũeller (talk) 12:28, 8 November 2018 (UTC)

  • @Mũeller: The section you linked tells you to see Image Geometry for complete details about the geometry argument. A bit more hunting finds that page, based on which I guess (without having tested it) that what you want is -gravity Center -crop 80x100%+0+0 image.out. It does not seem possible to specify offsets in percentages, but since you want to crop an equal amount left and right, you can crop "from the center". TigraanClick here to contact me 12:43, 8 November 2018 (UTC)
This post on the ImageMagick forum gives some examples of how to crop offsets in percentages. EniaNey (talk) 15:04, 14 November 2018 (UTC)
Thank you so much! That worked perfectly! Mũeller (talk) 15:19, 8 November 2018 (UTC)
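For a whole directory of images, the center-crop suggestion above can be scripted. A minimal sketch, assuming ImageMagick's `convert` is on the PATH and JPEG inputs in the current directory (adjust the glob and output directory to taste):

```shell
# Crop away the left and right 10% of every JPEG by keeping the
# central 80% of the width (-gravity Center -crop 80x100%), writing
# results into ./cropped. +repage resets the virtual canvas so the
# output files don't retain the old page geometry.
mkdir -p cropped
for f in *.jpg; do
  [ -e "$f" ] || continue   # skip if the glob matched nothing
  convert "$f" -gravity Center -crop 80x100%+0+0 +repage "cropped/$f"
done
```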

November 9

Can people rob a mobile phone having just the NUMBER?

I ran across an article that made it sound like cramming (fraud) could be done by anybody that has a mobile phone number, which seems hard to believe. But the FTC makes it sound the same way. Our article Cramming (fraud) makes it sound like somehow cashing a rebate check is needed, or something. What's the truth? How hard is it to take a payment from a mobile account?

Additionally, I'm curious if this is being done to "burner" phones (TracFone and congeners). Those don't have a monthly bill, but could scammers take money out of the accumulated minutes or something?

Also, is this actually useful for a small legitimate website as a way of collecting money? How much is the phone company's markup, and are there onerous bureaucratic requirements only a scammer could pass? Wnt (talk) 13:14, 9 November 2018 (UTC)

The scheme involved here - as detailed in this lengthy report from the FTC - requires your phone carrier to accept the charge first, and then to forward the charge to you in your bill. In a sense, your telephone service provider is acting as a credit-provider to a third party, on your behalf - and consequently, it is the responsibility of the telephone service provider to evaluate the risks of extending credit to a possibly non-creditworthy third party.
So, when you ask whether anyone can conduct this type of scam, the answer is "no." Only people who are trusted by your phone company would be able to get money out of this scheme. That category of merchants - legitimate or otherwise - would send a bill to your phone company; your phone company would pay them, track the charge, and some time later, you'd receive the bill. Functionally, it only works for merchants who are "white-listed" - authorized by your service provider to levy a charge.
That bureaucratic step alone is enough to make this type of scam difficult to execute. If you, with no legitimate credentials and no legitimate purpose, began a large-scale effort to send charge forwarding or "other forms of carrier billing arrangements" to a major company like AT&T, they would just ignore you. You would never get any money from AT&T out of the scheme, the people you attempted to victimize would never know that you were hassling their carrier with fraudulent paperwork, and the more aggressively you attempted this effort, the greater the probability that somebody would pursue civil or criminal charges against you.
The scam works best against the customers of little tiny mobile service carriers. The people who run those carriers might not realize that when they enable this technical service, they're effectively offering credit to anyone who asks for it - which is always a recipe for attracting scam-artists, even if it attracts other legitimate users as an ancillary detail.
Large and respectable phone providers will only honor charges billed from large and respectable merchants. This is where accountability comes in to play; the bigger the scam you run, the harder it would be for you to stay hidden while conducting fraudulent activity.
Like everything else in commerce, it is possible for a few bad guys to make a little scam work for a little while - but the bigger they get, the harder it is for them to hide; and if they don't get large, they don't get any meaningful amount of money. Taking it mainstream would require hiring and paying more bad people to help with the details. Taking it mainstream would also raise the hackles of the end-consumers, and then by proxy would raise the hackles of the major carriers, who would cease forwarding the charges from that particular scammer.
For the same reason, cheque fraud doesn't work, even though there's no real technical protection to prevent a bad check. A check is basically just a piece of paper with a bunch of numbers on it. You can walk into any major bank in the United States and try to cash a totally fictitious check from a totally fictitious account. Most big banks would verify the details and would throw you out if the account was false; even if it was legitimate, they won't give you money unless you have valid ID, already have a valid account, and can demonstrate the other things that scammers typically don't want to present (because they don't want to get caught).
Some smaller banks (like those yucky payday loan establishments) might actually let an anonymous entrant walk into the store and cash a check with no questions asked. They incur the risk that you might be scamming them, and they mitigate this risk by taking most of your money from you up front before they cash the check. They also mitigate their risk by associating with the scary folks in the dark alley they usually keep behind their not-so-pleasant store-front: if your check is bogus, they find you and get their money back. We have an awfully white-washed article sub-section on their ... ahem, "aggressive collection practices."
Nimur (talk) 22:39, 9 November 2018 (UTC)

November 11

mass conversion to MP3

I drive a vehicle that can play MP3 files from a thumb drive. As fate would have it, most of my music files are in another format, being from CDs untimely ripp'd by iTunes, so I didn't have quite enough music for a recent six-hour trip.

Is there (in MacOS) a convenient way to copy thousands of AAC files to MP3? —Tamfang (talk) 05:09, 11 November 2018 (UTC)

Depending on your definition of convenient, it can be done in iTunes. HenryFlower 15:30, 11 November 2018 (UTC)
Or not. Where that page says I should look for “Create MP3 Version” in the menu, I find only “Create AAC Version” (from AAC files)! —Tamfang (talk) 23:07, 11 November 2018 (UTC)
If you are willing to work in a command box, FFmpeg can convert audio files (not just video). Graeme Bartlett (talk) 22:36, 11 November 2018 (UTC)
If by "command box" you mean Unix shell, that's fine, I didn't get where I am today without writing bash scripts! —Tamfang (talk) 23:07, 11 November 2018 (UTC)
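As a sketch of the ffmpeg route: the loop below converts a tree of AAC files to MP3, mirroring the directory structure. It assumes ffmpeg was built with libmp3lame and that the files use the .m4a extension - adjust the paths and extension for your library.

```shell
# Convert every .m4a (AAC) file under $SRC to MP3 in $DST,
# preserving the directory tree. -q:a 2 selects high-quality LAME
# VBR; -n refuses to overwrite existing output files. The
# </dev/null stops ffmpeg from consuming the loop's stdin.
SRC="$HOME/Music"
DST="$HOME/Music-mp3"
find "$SRC" -name '*.m4a' | while IFS= read -r f; do
  out="$DST/${f#"$SRC"/}"        # mirror the path under $DST
  out="${out%.m4a}.mp3"          # swap the extension
  mkdir -p "$(dirname "$out")"
  ffmpeg -n -i "$f" -codec:a libmp3lame -q:a 2 "$out" </dev/null
done
```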
I have found Foobar2000 to be a great program for mass converting to MP3. It can resample, filter, make it so the loudness doesn't vary so much from song to song, etc. I just run it on my Windows box (I use Windows 10 where I have to and Slackware whenever I can -- love Neovim, hate Word, use LibreOffice when a client insists on docs in Word format) but they claim to have a version for Mac. I haven't tried it, but Foobnix might be an equivalent for Linux. If you try either of those, please drop me a line on my talk page so, if possible, I can do one less thing on Windows and one more thing on Linux. --Guy Macon (talk) 23:26, 11 November 2018 (UTC)

CAD and gaming graphic cards

I was told that CAD software and gaming software use kinda different graphic cards. CAD professionals would prefer Quadro and Gamers would rather buy GeForce. Basically the rationale is that CAD need fidelity up to the pixel, and games need FPS, the more the better. However, what's the big deal if CAD software gets one pixel wrong? It's not as if we could see individual pixels. How big of an issue is this distinction? --Doroletho (talk) 20:41, 11 November 2018 (UTC)

CAD users will use large monitors where they probably can see one pixel out. If the freehand drawing is out then the design will be wrong, so the monitor should represent what the designer wants to draw accurately. I don't expect graphic cards will make this kind of error anyway, but the CAD designer will need large and multiple monitors, possibly with good colour resolution and depth. Graeme Bartlett (talk) 22:34, 11 November 2018 (UTC)
The needs are different, but what drives the need for a different GPU? Couldn't they make a multi-purpose GPU? What makes designing an architecture for speed different from designing for precision? After all, CAD or games, the GPU must be flexible enough to deal with different kinds of CAD and games. --Doroletho (talk) 22:53, 11 November 2018 (UTC)
The requirements really are different. With games it's all about speed, speed, and more speed, and if the trees whipping by in a blur or the rocket that is about to kill you look a bit off, nobody cares. With CAD, you do care.
Drivers are also a big difference. The CAD vendors write drivers that are highly-tuned to the exact specifications of the CAD cards, and there is very little variation for them to deal with. Gaming cards come from a multitude of vendors who often deviate from the original reference design. This is really the main dealbreaker. If you are using, say Solidworks, you use the computer and graphics card they recommend and support.[2][3]
Another huge difference involves power consumption. Gaming cards produce a lot of heat, but they generally do it in short bursts. CAD cards are used for 8-12 hour work days, and are often given 24/7 rendering tasks in the background. They are also quieter; a bunch of screaming fans pumping a ton of heat into the room might be OK for a gamer wearing headphones, but not in an office with 30 workstations in one room.
And don't even get me started on the difference between mechanical CAD and PWB layout... --Guy Macon (talk) 23:50, 11 November 2018 (UTC)
Another major detail is whether the vendor of a specific software - say, Autodesk (the makers of AutoCAD and many other popular tools) - have specifically certified that they design and test their software on a specific graphics hardware and software configuration.
The vendor may publish generic requirements - for example, the Autodesk AutoCAD tool is compatible with any "Direct3D®-capable workstation class graphics card" - but when you're buying software that costs more than your house, it's worth knowing that a team of engineers will stake their professional reputation on the specific details, and will publish documentation and provide support for your configuration.
Nimur (talk) 18:47, 12 November 2018 (UTC)
I won't restrict this answer to CAD, since workstation cards are used for a lot more than CAD work. One thing that wasn't mentioned above is that workstation cards often have feature sets that are excluded from gaming cards, sometimes for market segmentation reasons, sometimes for cost reasons. (In the fairly distant past, some of the cards were similar enough that you could BIOS-mod a GeForce so that it appeared as a Quadro, or I believe likewise for ATI. This enabled whatever was disabled, often specifically driver support, although for the latter, simply hacking the driver was another solution. But things have moved on since then.) A common recent difference is in double precision performance or support, although of course not all applications (by which I don't just mean programs) will use those features; see e.g. these discussions [4] [5]. Deep colour support is I believe an even more recent feature in this vein. ECC RAM is another one, although that's obviously primarily cost related. (Although I suspect vendors would add ECC to a GeForce or AMD 'consumer' card if they could, so I guess you could still say the exclusion from the chips is most likely largely for market segmentation reasons.) This [6], although a user comment, is IMO illustrative of the complexity. At the high end, there are also differences in the best-specced card, especially regarding RAM, since as crazy as gamers can be, they still often have budgets, and Nvidia or AMD likewise need to make a profit, so they aren't going to release something if few are going to buy it. And adding 48GB of high-performing RAM to a card is going to make it somewhat expensive. (And it needs to be the top card since, again, as crazy as some gamers are, most are going to realise they will be loled out the door when they show off their fancy rig with a fancy card with 48GB of RAM which gets a lower FPS in every single game than that other fellow's rig with a far cheaper card.) Nil Einne (talk) 16:34, 13 November 2018 (UTC)

November 14

Help with an Excel formula

So, I'm working with an Excel spreadsheet. In the spreadsheet I've got the birthdates of children, and I want another cell to show how many months from now until each child turns 18. I've constructed a formula that I think ought to work, but it's giving me a #NUM! error.

Here's what I'm trying (the kid's birthdate is in E2): =DATEDIF((DATE(YEAR(E2)+18,MONTH(E2),DAY(E2))),TODAY(),"m")

Anyone have ideas on how to make it work? ~ ONUnicorn(Talk|Contribs)problem solving 19:55, 14 November 2018 (UTC)

Rather confusingly, DATEDIF takes the "start date" as its first parameter and "end date" as its second, so if you change the formula to =DATEDIF(TODAY(),DATE(YEAR(E2)+18,MONTH(E2),DAY(E2)),"m") you should get the result you want. (You have a redundant pair of parentheses around the DATE() in your first parameter, which I've removed in the corrected version). AndrewWTaylor (talk) 20:45, 14 November 2018 (UTC)
Yay! That works! Thank you so much! ~ ONUnicorn(Talk|Contribs)problem solving 20:49, 14 November 2018 (UTC)

November 16