Wikipedia:Reference desk/Archives/Computing/2011 February 12

Computing desk
Welcome to the Wikipedia Computing Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


February 12

Change language of OpenProj?

I eagerly downloaded and installed OpenProj after reading about it here, only to find that it defaulted to Simplified Chinese for all menus and dialogs. That is indeed my geographic locale, but I'm using an English copy of Windows XP. I can find no option/preference setting to change the language. A Google search only uncovered other people with the same problem. This is quite irritating! Can someone help me force OpenProj to display in English? The Masked Booby (talk) 03:19, 12 February 2011 (UTC)[reply]

Maybe if you change your locale to an English-speaking region, it will display in English. General Rommel (talk) 06:50, 12 February 2011 (UTC)[reply]
The solution given here http://sourceforge.net/p/openproj/feature-requests/117/ (second post by Barberousse) worked for me. The system language of Windows on my computer is Russian. I replaced the contents of the openproj.bat file, which is in the OpenProj working directory, with this line:

java -Duser.language=en -Duser.country=US -Xms128m -Xmx768m -jar openproj.jar

Then I just launched openproj.bat (right-click on the file and choose "Run" from the context menu) and enjoyed the interface in English. Good luck! dancingyogi (talk) 06:53, 10 October 2013 (UTC)[reply]

AES-256 Encryption Parallellism

Resolved

Hello RefDesk volunteers,

  I want to know whether data encryption is a primarily serial or parallel processing task, i.e. can data encryption be executed faster using e.g. twelve CPU cores compared to two CPU cores? The encryption method of interest here is AES-256, using a password with a length of between 12 and 20 characters.

  Thank you! Rocketshiporion 12:36, 12 February 2011 (UTC)[reply]

If you mean performing a single encryption on multiple cores, it's very difficult to properly parallelise most symmetric encryption functions, as they display the avalanche effect very strongly (quite deliberately - that's where much of their security is derived from). This is especially true if the underlying cryptosystem involves a chained inter-block dependency (e.g. CBC mode); non-chained modes (e.g. ECB) can be parallelised, but they're less frequently used, as they have poorer security characteristics. The most common strategy to increase throughput of such a cipher is to vectorise the computation, where each stage of the computation is handled by a different computational element. For the operations done by AES, CPUs are a poor choice to handle vectorisation, as the cost of handing off each block to another core (and the consequent loss of register and cache coherence) dominates the relatively minor cost of actually computing each phase. High-performance calculation of ciphers like this is typically performed by an ASIC, where vector stages are physically and logically adjacent to one another, making the handoff inexpensive. Where CPUs are used to do bulk AES, it's to perform unrelated operations; a password cracker (where each attempt is essentially unrelated to, and not dependent upon, other attempts) fully exploits the parallelism available. -- Finlay McWalterTalk 15:29, 12 February 2011 (UTC)[reply]
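To illustrate the chained inter-block dependency described above, here is a rough Python sketch of the CBC recurrence (it assumes the third-party pycryptodome package; the function name is just illustrative). Each ciphertext block is an input to the computation of the next one, so block i+1 cannot begin until block i has finished:

# Sketch of why CBC mode is inherently serial: C[i] = E_K(P[i] XOR C[i-1]).
# Assumes the third-party pycryptodome package (pip install pycryptodome).
from Crypto.Cipher import AES

def cbc_encrypt(key, iv, plaintext_blocks):
    ecb = AES.new(key, AES.MODE_ECB)        # the raw block cipher E_K
    prev = iv
    ciphertext_blocks = []
    for block in plaintext_blocks:          # 16-byte blocks
        xored = bytes(a ^ b for a, b in zip(block, prev))
        prev = ecb.encrypt(xored)           # this result feeds the next block,
        ciphertext_blocks.append(prev)      # so the loop cannot be split across cores
    return ciphertext_blocks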
In short, yes, if...
  1. you have more than (block size × number of processing units) = (16 × 24) = 384 bytes of data to encrypt (or decrypt), and
  2. the mode of operation your program uses can be parallelized.
For example, TrueCrypt is able to take advantage of additional CPU cores, since it usually encrypts (and decrypts) megabytes of data at a time and uses the XTS mode of operation, which is parallelizable. 118.96.166.75 (talk) 03:16, 13 February 2011 (UTC)[reply]
CTR mode is fast, embarrassingly parallel, and secure as long as you authenticate the decrypted message. XTS is a bit slower and more complicated; it has advantages in full-disk encryption but there's no need to use it for ordinary streaming encryption. I don't know what 7-Zip uses. Note that AES doesn't define any way of encrypting a file with a password; everybody has their own way of doing it. -- BenRG (talk) 22:11, 13 February 2011 (UTC)[reply]
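As a concrete illustration of CTR mode's parallelism, here is a rough Python sketch (again assuming the third-party pycryptodome package; the key, nonce and worker count are placeholders, not a recommendation). Each keystream block depends only on the key and that block's counter value, so the blocks can be handed to a process pool independently:

# Sketch of CTR-mode parallelism: keystream block i = E_K(nonce || counter i),
# so every block can be computed independently and XORed with its plaintext block.
# Assumes the third-party pycryptodome package.
from multiprocessing import Pool
from Crypto.Cipher import AES

KEY = bytes(32)      # placeholder 256-bit key, for illustration only
NONCE = bytes(8)     # placeholder 64-bit nonce

def ctr_block(args):
    index, plain_block = args
    counter = NONCE + index.to_bytes(8, "big")               # 16-byte counter block
    keystream = AES.new(KEY, AES.MODE_ECB).encrypt(counter)
    return bytes(p ^ k for p, k in zip(plain_block, keystream))

def ctr_encrypt_parallel(plaintext_blocks, workers=4):
    with Pool(workers) as pool:                               # blocks are independent,
        return pool.map(ctr_block, enumerate(plaintext_blocks))  # so they parallelise trivially

if __name__ == "__main__":
    blocks = [bytes(16)] * 8                                  # dummy 16-byte plaintext blocks
    print(ctr_encrypt_parallel(blocks))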

Objective C Pointers

While programming, I came upon a question. Whenever I'm creating an NSString just with stringWithFormat, I have to make the variable a pointer variable (i.e. it has the asterisk), as in:

NSString* myString = [NSString stringWithFormat:@"this is a string"];

However, if I'm creating an NSRange, it doesn't need to be a pointer variable, as in:

NSRange range = [myString rangeOfString:@"this"];

So what I want to know is: why do I need the asterisk with NSString variables, but not with NSRange variables? --Thekmc (talk) 14:00, 12 February 2011 (UTC)[reply]

Anyone? --Thekmc (talk) 22:08, 12 February 2011 (UTC)[reply]

According to this page, an NSString is an object, but an NSRange is only a struct. Looie496 (talk) 05:09, 13 February 2011 (UTC)[reply]

So you're saying that if I am creating an object, I need to make a pointer to it, but if I'm making something that is essentially a C type (like a struct), I don't need to make it a pointer? --Thekmc (talk) 21:43, 14 February 2011 (UTC)[reply]

What flavor for Puppy Linux?

If you have to install a Linux program that comes in the Ubuntu, Debian, Fedora and SUSE flavors, which one is the most likely to fit? Quest09 (talk) 14:03, 12 February 2011 (UTC)[reply]

You should search for one for Puppy Linux first. Then, if you can't find it, use woof on the Ubuntu binary package. -- kainaw 16:21, 12 February 2011 (UTC)[reply]

The only logical reason to use Puppy Linux is if you have a hard drive so small you are literally stuck in the late 80s or earlier. ¦ Reisio (talk) 03:41, 13 February 2011 (UTC)[reply]

Maybe the OP likes puppies? --Ouro (blah blah) 09:22, 13 February 2011 (UTC)[reply]
Reisio, perhaps you have only tried to run Linux on a system that you might be able to call a "personal computer." However, one of the most powerful features of Linux is the ability to run it on other platforms - many of which do not even have a hard disk drive. I can list off a few dozen reasons why one might want a tiny version of Linux. I do not think it's fair to call that "1980s technology" - especially because Linux didn't exist in the 1980s. At the very least, this is 2010s technology.
Quest09, you might find a Debian package (.deb) the easiest to install on Puppy Linux. Alternatively, you can compile from source code - a lot of your favorite Linux programs provide freely licensed source code, which helps with porting to different types of computers and operating systems. Nimur (talk) 20:59, 13 February 2011 (UTC)[reply]
Yes, broad architecture support is a feature of "Linux", but not Puppy Linux. I can also list off a few dozen reasons why one might want a tiny version of Linux, which is why I stated the only reason to use Puppy is if your primary concern is that it fit on hardware so ancient as to have incredibly low storage capacity — I was referring to hardware from the late 80s, not software. ¦ Reisio (talk) 16:00, 14 February 2011 (UTC)[reply]
Friends, don't quarrel over something as nice and as useful as Linux. I use Fedora. --Ouro (blah blah) 17:20, 14 February 2011 (UTC)[reply]

3d or 3d?

If you have two 3d pictures, but in one case you can turn it, and in the other case you can only look at it from one specific angle, what do you call this difference? Both are 3d and neither is real, but one has more 3d in it. Quest09 (talk) 16:08, 12 February 2011 (UTC)[reply]

Do you mean 3-D holographic for the one that you can turn? --Aspro (talk) 17:16, 12 February 2011 (UTC)[reply]
No, just a 3d object on a normal screen. But, one that can be virtually turned around. Quest09 (talk) 18:30, 12 February 2011 (UTC)[reply]
Are you talking about postcards in space, also known as "2.5D"? In other words, one of the images can be rotated, but looks like a postcard once it is rotated enough?--Best Dog Ever (talk) 19:41, 12 February 2011 (UTC)[reply]
No, I'm asking about two quite simple 3d images: imagine you drew a cube with Blender. You can rotate the cube on 3 axes. And now imagine you drew a cube with GIMP. They are different, even if the GIMP image is a 3d drawing. Do you know what I mean? Quest09 (talk) 20:51, 12 February 2011 (UTC)[reply]
The one that is not really a 3D model is just a Graphical projection. --Mr.98 (talk) 21:21, 12 February 2011 (UTC)[reply]
Well, anything rastered to screen is also a graphical projection. It just so happens that software like Blender, which stores a complete 3D geometry for each (x,y,z) point, has the algorithmic capability to render many different projections from arbitrary view-angles and distances, while a hand-drawn projection is a "one-shot" deal that is a fixed, 2D set of (x,y) geometries. There are intermediate models, too: sophisticated computational geometric models exist that can be rendered from multiple view-angles, without being a complete 3D representation of the object from all view angles. Geometric modeling or 3D modeling might be a useful read. We also have solid modeling, surface modeling, point clouds, and many other interesting ways to describe 3D objects to a computer. In general, it is prohibitively expensive (uses too much memory) to store every point of a 3D object (in fact, it is theoretically impossible: at best, we can sample a quantized subset of the object's geometry). So computer scientists have come up with thousands of clever tricks to reduce the complexity of the model while still producing a high-quality raster representation for some particular use. Nimur (talk) 21:27, 13 February 2011 (UTC)[reply]
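As a toy illustration of the difference (a sketch only; the vertex list and angles are made up), a program that stores a cube as 3D vertices can re-project it from any view angle, whereas a flat drawing only ever contains one fixed set of 2D points:

# Toy sketch: a stored 3D model (cube vertices) can be re-projected at any angle;
# a hand-drawn picture cannot, because the depth information was never stored.
from math import cos, sin, radians
from itertools import product

CUBE = list(product((-1, 1), repeat=3))    # the eight (x, y, z) corners of a cube

def project(vertices, yaw_deg, pitch_deg):
    """Rotate the model, then drop z (a simple orthographic projection)."""
    yaw, pitch = radians(yaw_deg), radians(pitch_deg)
    flat = []
    for x, y, z in vertices:
        x, z = x * cos(yaw) + z * sin(yaw), -x * sin(yaw) + z * cos(yaw)          # rotate about the y axis
        y, z = y * cos(pitch) - z * sin(pitch), y * sin(pitch) + z * cos(pitch)   # rotate about the x axis
        flat.append((x, y))                # discard depth for the 2D picture
    return flat

print(project(CUBE, 30, 20))   # any yaw/pitch gives a new 2D view of the same stored model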

Sporadic wireless connectivity

I am having trouble connecting to my wireless network. I have an XP machine with a Realtek wireless adapter. Every few seconds it will go to full bars/"Excellent" signal, but then it will go back to "Not connected", then back to Excellent signal, and so on. This happens whether I use the driver that came with the hardware or Windows to configure it. It always says "Acquiring network address" when it is at "Excellent" and sometimes I can access the internet briefly, but it always goes back to "not connected"/all red. Why is this, and how do I fix it? 72.128.95.0 (talk) 17:18, 12 February 2011 (UTC)[reply]

The problem doesn't necessarily lie with your computer and/or configuration. Are you able to check the same wireless network on a different computer? What is the behaviour then? --Ouro (blah blah) 18:29, 12 February 2011 (UTC)[reply]

Multi-threaded Data Compression

Hello again,

  I'm currently using 7zip to make compressed and encrypted data backups. My compression method of choice is LZMA, with a 1024MB dictionary size and a 256-byte word size, using the maximum of 4 CPU threads. My chosen encryption method is AES-256 with a password which is between 12 and 20 characters in length. I have two six-core processors, each of which supports twelve threads, and would like to use all 24 threads to compress and encrypt data. So my question is as follows: does a software product exist which fulfils the following requirements?

  • Compresses data using the LZMA method with a 1024MB dictionary and 256-byte word size.
  • Supports a minimum of 20 CPU threads for use in data compression.
  • Encrypts data using the AES-256 algorithm.

  Thanks. Rocketshiporion 18:12, 12 February 2011 (UTC)[reply]

I'm pretty sure nothing out there will meet those requirements, because hardly anyone has 24 CPU threads, and of those few, even fewer probably wish to use them for data compression. The only thing you might want to give a shot is asking a programmer whether he can write a mod/plugin for 7zip that allows it to use all of your threads...--87.79.212.251 (talk) 20:32, 12 February 2011 (UTC)[reply]
Does any software exist which can do LZMA compression using anything more than four CPU threads? Rocketshiporion 21:18, 12 February 2011 (UTC)[reply]
Take a look at this article, especially its comments section. Apparently, LZMA is not very parallelizable. One way to parallelize it more would be to break the data into several chunks and compress them independently, but that will hurt the compression a lot. 118.96.166.75 (talk) 03:00, 13 February 2011 (UTC)[reply]
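As a rough sketch of that chunking approach (standard-library Python; the chunk size and worker count are arbitrary), the input is split into independent pieces and each piece is compressed by a separate process, at the cost of every chunk starting with an empty dictionary:

# Sketch of chunked parallel compression: split the input, compress each chunk
# independently in its own process, and keep the results. The compression ratio
# suffers because no dictionary is shared between chunks.
import lzma
from multiprocessing import Pool

CHUNK_SIZE = 64 * 1024 * 1024      # 64 MB per chunk; arbitrary for illustration

def compress_chunk(chunk):
    return lzma.compress(chunk, preset=9)

def parallel_compress(data, workers=20):
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    with Pool(workers) as pool:
        return pool.map(compress_chunk, chunks)    # one independent LZMA stream per chunk

if __name__ == "__main__":
    compressed = parallel_compress(b"example data" * 1000)
    print(sum(len(c) for c in compressed), "bytes compressed")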
LZMA appears to be yet another LZ77 variant, and LZ77 variants are normally highly parallelizable because the dictionary at any point is just the last n bytes of the file. LZMA probably has some additional context for the range coder, and you would lose some compression ratio if you periodically flush that, but you don't have to throw away the 1GB dictionary. Some comments in the linked thread make me think that Igor Pavlov might have neglected to provide a way to flush the range-coding context in the file format. In that case a change to the format would be needed, or a custom decompressor. But that would be nothing to do with the algorithm itself, it would just be an oversight in the file format. -- BenRG (talk) 21:11, 13 February 2011 (UTC)[reply]
One more question: is there any software which can do data compression (with any other algorithm) using anything more than four CPU threads? Rocketshiporion 19:38, 13 February 2011 (UTC)[reply]
I believe 7-zip will efficiently use arbitrarily many cores for bzip2 compression. This has no cost in compression ratio because bzip2 independently compresses 900,000-byte chunks even when single threaded. But that means that bzip2 does poorly on files that benefit from a huge LZ dictionary (such as full Wikipedia dumps), so it probably won't meet your needs.
There's a parallelized xz called pxz, but (based on the extremely limited documentation) it appears to just divide the file into chunks, without sharing dictionaries. -- BenRG (talk) 21:30, 13 February 2011 (UTC)[reply]
Doesn't LZMA2 have better multithread support? That's been in the 7-zip beta for ages. I believe the compression is better with fewer threads, however, and it may just use chunks. Nil Einne (talk) 16:02, 14 February 2011 (UTC)[reply]

Search engine that ignores robots.txt

Greetings, this will probably make me sound like a complete greenhorn but could anyone recommend a good search engine that pays no attention to robots.txt? Thanks a lot in advance.--87.79.212.251 (talk) 20:28, 12 February 2011 (UTC)[reply]

Just use any search engine, and put
-"robots.txt"
in the search string. Rocketshiporion 21:25, 12 February 2011 (UTC)[reply]
The questioner is referring to the robots exclusion standard, not a specific query. All search engines are supposed to use it. Many people have written articles that claim Google and Bing (and any other search engine) don't obey the robots.txt file. However, what I see in those complaints is that the robots.txt file is formatted incorrectly and, therefore, ignored. It is not a case of the search engines purposely ignoring all robots.txt files. -- kainaw 21:30, 12 February 2011 (UTC)[reply]
robots.txt is mainly used to exclude things that are dynamic and possibly large and have no value as search results. (For example, things like Special:RecentChanges.) Search engines exclude those things not just to be good citizens of the Web, but also to improve the quality of their results. People don't generally (if they know what they're doing) use robots.txt as a security device to hide things; if they want them hidden, they just need to deny access to the outside world altogether. Paul (Stansifer) 14:16, 13 February 2011 (UTC)[reply]

Ignoring robots.txt would be dangerous! One of the major uses of robots.txt is to ensure that "robots" (web crawlers) do not follow links that contain dynamic content or could cause the server to do a lot of work. For instance, Wikipedia's robots.txt disallows access to /wiki/Special:Search and /wiki/Special:Random, among other pages. Allowing crawlers into the search page would allow them to scrape the search index, whereas allowing them into the random page would cause them to get random content, which would be undesirable. --FOo (talk) 18:00, 13 February 2011 (UTC)[reply]
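For reference, the entries described above look something like this in a robots.txt file (an illustrative excerpt, not the complete file):

User-agent: *
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random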

Yes, that's all great, but it's not helpful at all, because I didn't ask why most search engines don't ignore it, but rather which search engines do. I know that robots.txt is not often used to hide things, but I am trying to investigate a group of servers that have popped up, and they all used robots.txt to make it hard for people to locate them via search engines. Sorry to be so frank, but I really need a link, not excuses as to why you don't know of any.--213.168.109.74 (talk) 20:21, 13 February 2011 (UTC)[reply]
We have an article on something called Botseer. Its website seems to be down, but apparently its purpose was to index robots.txt files. There's an archived version of part of the site at [1], but it probably won't be much use for your purposes (it has things like statistics on how many robots files mentioned some well-known bots). Perhaps what you're asking for just doesn't exist: people have asked this question before on other sites and got no useful answers. Perhaps you could be more specific about what exactly you are trying to do, somebody might have ideas about other approaches you could try? 130.88.139.45 (talk) 12:20, 14 February 2011 (UTC)[reply]
I suspect it's unlikely many search engines are going to boast about ignoring robots.txt, because it's liable to get them banned by many sites. And the sites themselves know there is limited advantage to ignoring robots.txt. There may be some small obscure search engines that do, but it may not help you if they don't even index these sites, or if they're banned because everyone knows they ignore robots.txt. Nil Einne (talk) 15:52, 14 February 2011 (UTC)[reply]
You could write your own. If you know which servers you are looking at, it's a relatively simple process, and probably quicker than trying to find some obscure search engine that would do it for you.
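For example, here is a minimal Python sketch of such a crawler (the starting URL is a placeholder; the point is simply that it never consults robots.txt, so use it responsibly and only on servers you are entitled to crawl):

# Minimal crawler sketch that fetches pages from known servers and never reads robots.txt.
# Standard library only; the starting URLs are placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URLS = ["http://example.com/"]    # put the servers you are investigating here

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_urls, limit=100):
    allowed = {urlparse(u).netloc for u in start_urls}   # stay on the servers of interest
    seen, queue, pages = set(start_urls), deque(start_urls), {}
    while queue and len(pages) < limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        pages[url] = html                    # "index" the page (here we just keep the raw HTML)
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).netloc in allowed and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages

if __name__ == "__main__":
    for url in crawl(START_URLS):
        print(url)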