I'm taking a computer course and the teacher said the ASCII table is located far away down the memory hierarchy. To me it seems it has to be somewhere pretty close to the CPU, since it transforms characters into machine-readable (hexadecimal/binary) sequences, and you can read letters even when no OS is present. I found that it is indeed in the BIOS (Basic Input Output System). Can anyone elaborate on the physical location of the ASCII table? Is it possible that it is inside the CPU? And how close is it really to the CPU in the computer architecture?
If I made any mistakes, please correct me; we're here to learn.
The ASCII table is not managed by the BIOS; it is handled by the OS (and by applications), so it does not have a single physical location. ASCII is an agreed-upon mapping between numbers and characters, not a piece of hardware.
Nowadays systems don't necessarily use ASCII; they often use Unicode.
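To illustrate that ASCII is just a numeric convention rather than a table stored in a particular chip, here is a small C sketch (my own illustration, not from the original answer):

    #include <stdio.h>

    int main(void) {
        char c = 'A';
        /* On an ASCII-compatible system, 'A' is simply the number 65. */
        printf("'%c' is stored as %d (0x%02X)\n", c, c, c);

        int code = 0x61;            /* 0x61 is 97, which ASCII maps to 'a' */
        printf("%d prints as '%c'\n", code, (char)code);
        return 0;
    }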
In many programming problems (e.g. some Project Euler problems) we are asked to report the answer as the remainder left after dividing the answer by 1,000,000,007.
Why not any other number?
Edit:
2 years later, here's what I know: the number is a big prime, and any answer to such a question is so large that it makes sense to report a remainder instead (as the number may be too large for a native datatype to handle).
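For example (a minimal sketch of my own to illustrate the point), computing n! modulo 1,000,000,007 and reducing at every step keeps the intermediate values small enough for a 64-bit integer, even though n! itself would overflow any native type:

    #include <stdio.h>
    #include <stdint.h>

    #define MOD 1000000007ULL

    /* n! mod MOD; the running result stays below MOD, so each product fits in 64 bits. */
    uint64_t factorial_mod(unsigned n) {
        uint64_t result = 1;
        for (unsigned i = 2; i <= n; ++i)
            result = (result * i) % MOD;
        return result;
    }

    int main(void) {
        printf("100! mod 1000000007 = %llu\n",
               (unsigned long long)factorial_mod(100));
        return 0;
    }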
Let me play telepath: numbers of the form 1000...007 are often prime, and 1000000007 is the largest of them that fits in a 32-bit signed integer. Since prime numbers are used to calculate hashes (by taking the remainder of a division by the prime), 1000000007 is a good choice for calculating a 32-bit hash.
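To illustrate that hashing use (again a sketch of my own, not from the original answer), here is a simple polynomial rolling hash of a string reduced modulo 1000000007; the result always fits in 32 bits because the modulus does:

    #include <stdio.h>
    #include <stdint.h>

    #define MOD  1000000007ULL
    #define BASE 131ULL   /* an arbitrary small base, chosen only for illustration */

    /* Polynomial rolling hash: h = s[0]*BASE^(n-1) + ... + s[n-1], all mod MOD. */
    uint32_t hash_string(const char *s) {
        uint64_t h = 0;
        for (; *s; ++s)
            h = (h * BASE + (unsigned char)*s) % MOD;
        return (uint32_t)h;
    }

    int main(void) {
        printf("hash(\"hello\") = %u\n", (unsigned)hash_string("hello"));
        return 0;
    }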
I want to learn device driver development, so how do I start? Is there any material for beginners, or something similar?
The canonical reference is Linux Device Drivers, 3rd Edition; although it's a few years old now, it's still close enough to current kernels to be useful.
Quite a lot of material is not covered in it, particularly anything that's device- or bus-specific, or the way the kernel has been developing to support ARM SoC devices over the last few years.
I would suggest starting to learn Linux device drivers in a PC environment, so that you can relate most things to what you are already familiar with. One approach is to get the Linux source code and try to understand how the kernel is invoked and how the first user-space process is started from kernel space. The page linked below should also be helpful, and a minimal example module is sketched after the link.
http://en.wikiversity.org/wiki/Reading_the_Linux_Kernel_Sources
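To get a feel for the mechanics, here is a minimal "hello world" module sketch (illustrative only; the exact Makefile/kbuild setup depends on your kernel version):

    #include <linux/module.h>
    #include <linux/kernel.h>
    #include <linux/init.h>

    /* Called when the module is loaded (e.g. via insmod). */
    static int __init hello_init(void)
    {
        printk(KERN_INFO "hello: module loaded\n");
        return 0;
    }

    /* Called when the module is removed (e.g. via rmmod). */
    static void __exit hello_exit(void)
    {
        printk(KERN_INFO "hello: module unloaded\n");
    }

    module_init(hello_init);
    module_exit(hello_exit);

    MODULE_LICENSE("GPL");
    MODULE_DESCRIPTION("Minimal hello-world module");

Build it against your kernel headers with a one-line kbuild Makefile (obj-m += hello.o, assuming the file is named hello.c), load it with insmod, and the messages show up in dmesg.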
Why restrict allowed usernames with various rules? For example, why can't a user have the username "#123 qw"? Are there any technical difficulties, or is it just about community rules?
Also, is it OK to have national characters in a username? If I use UTF-8 encoding for my website, it should work just fine in all browsers.
A username within a system is, most of the time, for the consumption of HUMANS; therefore, from a usability point of view, it should be READABLE.
And yes, you can use your national characters in usernames, but make sure you understand character encoding, storage, and retrieval. Your system/application should be ready to consume the selected encoding at every level, e.g. client side, server side, and the database, and so should the tools you use to work with each tier, e.g. IDEs.
So, from my point of view, you need some extra knowledge and effort to handle such a system without killing usability; a small validation sketch follows.
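For illustration (my own sketch, not part of the original answer), a typical restrictive policy boils down to a simple check like this, allowing only ASCII letters, digits, and underscores within an arbitrary length range:

    #include <ctype.h>
    #include <stddef.h>
    #include <stdio.h>

    /* Returns 1 if the name is 3-20 chars of [A-Za-z0-9_], otherwise 0.
       The length limits and allowed set are arbitrary, illustrative choices. */
    int is_valid_username(const char *name) {
        size_t len = 0;
        for (; name[len]; ++len) {
            unsigned char c = (unsigned char)name[len];
            if (!(isalnum(c) || c == '_'))
                return 0;
        }
        return len >= 3 && len <= 20;
    }

    int main(void) {
        printf("%d\n", is_valid_username("alice_42"));  /* 1 */
        printf("%d\n", is_valid_username("#123 qw"));   /* 0: '#' and space rejected */
        return 0;
    }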
I believe I can give you more than one reason, but the first that comes off the top of my head is what "#123 qw" looks like once it has been percent-encoded into a URL:
http://www.example.com/profile/%23123%20qw
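What happens there is ordinary percent-encoding: every character outside the unreserved URL set is replaced by %XX. A small sketch of my own showing the effect on that username:

    #include <stdio.h>
    #include <ctype.h>

    /* Prints a percent-encoded form of s, keeping only unreserved URL characters. */
    static void print_percent_encoded(const char *s) {
        for (; *s; ++s) {
            unsigned char c = (unsigned char)*s;
            if (isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~')
                putchar(c);
            else
                printf("%%%02X", c);
        }
        putchar('\n');
    }

    int main(void) {
        print_percent_encoded("#123 qw");   /* prints %23123%20qw */
        return 0;
    }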
Ok, quick points for someone who is better at searching than I am...
I know I have seen a list of translations of common application strings like "File," "Open," "Save," "Close," and "OK" into other languages. This was not just a scrape of Google Translate, but an actual "official" list based on the localized OS. It seems to me that it was on Microsoft's site, but I'm not 100% sure.
I need to translate my application into Indonesian and wanted to give our translators a head start by filling in those common terms with the standard values, but now I cannot find the web page(s)! I've spent about 15 minutes and will continue to search (and will post the answer if I find it), but if someone else knows where that is (or finds it first!), please answer.
Microsoft Language Portal
I'm looking for a programming language that would scale well on multiprocessors and distributed systems, and is able to work well with the GPU for number crunching.
What do you think, is Erlang and CUDA a good match?
Later edit: I want to use it for image processing: feature detection, bundle adjustment, and scene reconstruction, so it's fairly parallel. The GPU would do the computationally intensive part, and Erlang would just manage the tasks and shuffle data around.