7 Replies - 1270 Views - Last Post: 27 April 2014 - 08:35 PM

#1 masterori

  • D.I.C Head

Reputation: 0
  • Posts: 69
  • Joined: 27-January 14

best way to generate a number from a string?

Posted 19 April 2014 - 12:01 PM

I was wondering what would be the best way to generate a number from a string of characters/letters? The numbers don't have to be 100% unique.

For example:
"google/testing.something://" //--> generates a # like 425
"hello.oadf/dskf:.#$%2test" //--> generates a # like 320
"justdoingtest:/\" //--> generates a # like 320



Replies To: best way to generate a number from a string?

#2 modi123_1

  • Suitor #2

Reputation: 14146
  • Posts: 56,698
  • Joined: 12-June 08

Re: best way to generate a number from a string?

Posted 19 April 2014 - 12:18 PM

Convert each character to its ASCII value and add them up.
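A minimal sketch of that idea in JavaScript (the helper name sumCharCodes is just for illustration):

function sumCharCodes(str) {
    var total = 0;
    for (var i = 0; i < str.length; i++) {
        total += str.charCodeAt(i); // add the numeric code of each character
    }
    return total;
}

var n = sumCharCodes("google/testing.something://"); // the char-code sum for that string

Strings containing the same characters in a different order produce the same sum, which is fine here since the numbers don't have to be unique.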

#3 Blindman67

  • D.I.C Addict

Reputation: 140
  • Posts: 620
  • Joined: 15-March 14

Re: best way to generate a number from a string?

Posted 19 April 2014 - 01:39 PM

As computers can only handle numbers, they store strings as a series of numbers, each number representing one character. So "A" is stored as 65, "B" as 66, and the lowercase letters start at "a" = 97, "b" = 98, etc.

In the past strings were stored one byte per character. Each byte has 8 bits, and each bit represents either zero or one, so the total number of possible bit combinations in a byte is 2*2*2*2*2*2*2*2, or 2 to the power of 8, which is 256 different combinations. This encoding is called ASCII.

That was plenty in the past, but now computers have to represent not only the western characters but all the Chinese, Japanese, Arabic, etc. characters as well, and 256 characters just won't do the job.

To encode all these possibilities web pages and web based languages use Unicode as a standard to encode characters. Unicode stores each character as 2 bytes, giving 16 bits. Thus 2^16 = 65536 (^ is power) possible combinations.

It is sometimes handy to have the computer convert from a character to the number representing it, or from a number back to the character. JavaScript provides two functions that will do this for you.

charCodeAt() and fromCharCode(). You use them as below.

var str = "This is a string";
var ch = str.charCodeAt(8); // get the char code of the 8th character a = 97

var num = String.fromCharCode(97); // returns the string "a"

// It is also handy to know the number of characters in a string
// this is done via a property of the string called length;

var len = str.length;  // returns 16 

Some of the character codes do not represent printable characters but special control characters, like tab, line feed, carriage return, backspace, and form feed. These come from the days when computers did not have screens but used printers. We still use them today.

To put these characters into a string we use the backslash (\) character. It tells JavaScript that the next character represents a special character.

  • "\n" new line.
  • "\r" carriage return.
  • "\f" form feed.
  • "\t" tab.
  • "\b" backspace.


The backslash is also used to insert " and ' into a string, because " and ' mark the start and end of a string and will cause problems when used inside the string itself. To solve this, use \" or \'.

var str = "Dave said "Hi there" "; //will cause the program to fail.
// to fix this 
var str = "Dave said \"Hi there\""; 
// or 
var str = 'Dave said "Hi there"'; 

// and if you need a backslash in your string
var str = "this \\ is a backslash"; // shows only one \


I mention this, Masterori, because your 3rd string has a backslash in it and is not a correctly formatted string literal. You can see that even the D.I.C. code segment does not style that last string correctly. And your first string has something in it that caused that line to mess up its formatting as well; I am still trying to work out what you typed to make it underline the last part of that string.

Hope this helps.

This post has been edited by Blindman67: 19 April 2014 - 01:40 PM


#4 felgall

  • D.I.C Regular

Reputation: 68
  • Posts: 365
  • Joined: 22-February 14

Re: best way to generate a number from a string?

Posted 19 April 2014 - 06:34 PM

Blindman67, on 20 April 2014 - 06:39 AM, said:

Each byte has 8 bits, and each bit representing either zero or one. So the total different possible combinations of bits in a byte is 2*2*2*2*2*2*2*2 or 2 to the power of 8 which is equal to 256 different combinations. This encoding is called ASCII


ASCII actually only uses 7 of the 8 bits (so as to allow the last bit to be used as a check bit when needed) - there are only 128 ASCII characters.

Blindman67, on 20 April 2014 - 06:39 AM, said:

To encode all these possibilities web pages and web based languages use Unicode as a standard to encode characters. Unicode stores each character as 2 bytes, giving 16 bits. Thus 2^16 = 65536 (^ is power) possible combinations


You have the count wrong for Unicode as well. There are nearer to 100000 Unicode characters currently defined, and Unicode encodings can use one, two or FOUR bytes per character. JavaScript currently only supports the first 65536 Unicode characters, but ES6 is adding support for the rest of them. HTML already supports UTF-32, which covers the full Unicode range of over 100000 different characters.
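A quick sketch of that 65536 limit (the variable names and the emoji are just an example): a character outside the 16-bit range is stored by JavaScript as two "surrogate" code units, which you can see with length and charCodeAt().

var smiley = "\uD83D\uDE00"; // the emoji U+1F600, beyond the first 65536 characters
var len = smiley.length;         // 2 - JavaScript counts it as two 16-bit units
var high = smiley.charCodeAt(0); // 0xD83D (55357), the first half of the surrogate pair
var low = smiley.charCodeAt(1);  // 0xDE00 (56832), the second half

So charCodeAt() only ever reports the individual 16-bit halves; reading the real code point of such a character needs the ES6 additions mentioned above.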

#5 Blindman67

  • D.I.C Addict

Reputation: 140
  • Posts: 620
  • Joined: 15-March 14

Re: best way to generate a number from a string?

Posted 20 April 2014 - 01:11 AM

LOL well I was trying to keep it simple for the OP.

The 8th bit was specifically reserved in the ASCII standard for a parity bit (not a check bit, if we must quibble) or a send/receive control line. These days "ASCII" commonly (and incorrectly) refers to the full 8 bits; the correct reference should be ASCII-8, and the MIME type US-ASCII (in practice served as UTF-8) includes the 8th bit, which can not be arbitrarily assigned to even or odd parity or dropped if used as line control.

Many home computers in the late 70's and 80's used ASCII as a generic term, seldom conforming to the ASCII standard mapping and bit depth.

There are also many common-usage terms that refer to ASCII in its non-standard, mostly 8-bit form: ASCII art, ASCII files, hex to ASCII, all of which require the 8th bit.

I will stick to the meaning in common usage and hold my non standard 8th bit in high order regard. LOL

#6 ArtificialSoldier

  • D.I.C Lover

Reputation: 2039
  • Posts: 6,247
  • Joined: 15-January 14

Re: best way to generate a number from a string?

Posted 21 April 2014 - 10:43 AM

Isn't the 8th bit also used for "extended ASCII"? That set is a staple for ASCII art; that's where all the line-drawing characters are.

#7 masterori

  • D.I.C Head

Reputation: 0
  • Posts: 69
  • Joined: 27-January 14

Re: best way to generate a number from a string?

Posted 27 April 2014 - 07:29 PM

Ah thank you for your help.

Let's say I want to limit the range that the generated number falls in. Would a combination of charCodeAt() and Math.max()/Math.min() work, or would I need an algorithm?

An example: "hello this is a string" --> guarantee the ASCII numbers add up a number within 1 - 1000

#8 Blindman67

  • D.I.C Addict

Reputation: 140
  • Posts: 620
  • Joined: 15-March 14

Re: best way to generate a number from a string?

Posted 27 April 2014 - 08:35 PM

If you use min and max you will not know whether you have limited the number or not.
Just use two if statements:
var min = 1;
var max = 1000;
var message = "Number in valid range.";
if (val < min) {
    val = min;
    message = "Number was too low.";
} else if (val > max) {
    val = max;
    message = "Number was too high.";
}
// ... display the message ...


If it's not important to know whether the number has been truncated, then using Math.min and Math.max is a good solution to the problem.
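Putting the two pieces together, a minimal sketch of clamping a char-code sum into the 1 - 1000 range (variable names are just for illustration):

var str = "hello this is a string";

// sum up the char codes, as suggested earlier in the thread
var total = 0;
for (var i = 0; i < str.length; i++) {
    total += str.charCodeAt(i);
}

// clamp the sum into the range 1 - 1000
var clamped = Math.min(1000, Math.max(1, total));

Keep in mind that longer strings will almost always get clamped to 1000, so the values will not spread evenly across the range.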
