ishkabible's Profile User Rating: *****

Reputation: 1701 Grandmaster
Group:
Authors
Active Posts:
5,902 (2.4 per day)
Joined:
03-August 09
Profile Views:
102,758
Last Active:
Today, 02:44 AM
Currently:
Offline

Previous Fields

Country:
US
OS Preference:
Windows
Favorite Browser:
Chrome
Favorite Processor:
Intel
Favorite Gaming Platform:
PC
Your Car:
Honda
Dream Kudos:
425

Latest Visitors

ishkabible 5k posts!

Posts I've Made

  1. In Topic: Computing MOBA video game Dota 2 using a fitness algorithm

    Posted 25 Apr 2016

    I'd like to know more about what you're trying to do here (what modi asked), but if your goal is to make a game AI using genetic programming (a very, very hard task, by the way), you could use MMR as a fitness function. It would be very slow, though, and would require playing many, many games to get an accurate score.
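    A minimal sketch of what that loop could look like. `play_matches`, the genome encoding, and all the numbers here are hypothetical; the placeholder fitness stands in for the extremely slow step of actually playing games and reading off MMR:

```python
import random

def play_matches(genome, n_games=10):
    # Placeholder: a real run would launch n_games of Dota 2 with a bot
    # driven by 'genome' and return something like its MMR or win rate.
    # Seeded per-genome so the same genome always scores the same.
    rng = random.Random(sum(genome))
    return sum(rng.random() for _ in range(n_games)) / n_games

def evolve(pop_size=20, genome_len=8, generations=5):
    # Start from random bit-string genomes.
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=play_matches, reverse=True)
        survivors = pop[:pop_size // 2]
        # Refill the population by mutating random survivors (5% bit flips).
        pop = survivors + [
            [bit ^ (random.random() < 0.05) for bit in random.choice(survivors)]
            for _ in range(pop_size - len(survivors))
        ]
    return max(pop, key=play_matches)

best = evolve()
```

    Every call to `play_matches` is where the "many, many games" cost lives; with real matches each fitness evaluation takes hours, which is why this approach is so slow.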
  2. In Topic: Making JARVIS

    Posted 20 Apr 2016

    So, actually, I know exactly how to build a Jarvis-like AI. It's not even very challenging from a CS perspective.

    First off, note that neural networks can approximate any function given enough hidden units, and the neurons used in neural networks are actually quite a bit more precise, and likely slightly more powerful computationally speaking, than the ones in our heads. We even have decent algorithms for training recurrent networks now, so we can pretty closely match what our brains are doing with deep recurrent neural networks. So basically, if we build a big enough one, we can just directly simulate a brain.
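    To make "recurrent" concrete, here's a single recurrent unit in plain Python; the weights are arbitrary illustrative numbers, not anything trained:

```python
import math

def rnn_step(x, h, w_x=0.5, w_h=0.8, b=0.1):
    # The new hidden state mixes the current input with the previous
    # hidden state; that feedback loop through time is what makes the
    # network "recurrent" rather than purely feed-forward.
    return math.tanh(w_x * x + w_h * h + b)

h = 0.0                               # initial hidden state
for x in [1.0, 0.0, -1.0, 0.5]:       # a tiny input sequence
    h = rnn_step(x, h)
```

    Training networks of these units used to be hard (vanishing gradients), which is why decent recurrent training algorithms are a relatively recent development.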

    How big does this network need to be? According to Wikipedia, the human brain has about 86,000,000,000 neurons. According to this, researchers managed, using some insane computational power, to train a network with 160,000,000,000 parameters, but that network was not recurrent, so we'll need a bigger computer to handle the recurrent case. (Comparing neuron counts to parameter counts is loose at best, but it gives a sense of scale.) The point is that modern supercomputers are closing in on the level of computation we need to train a network large enough to compete with humans.
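    Quick back-of-envelope on those figures (note they count different things: the 86 billion is neurons, the 160 billion is trained parameters):

```python
# Figures from the post above.
brain_neurons = 86_000_000_000
trained_params = 160_000_000_000

# Memory just to hold that many parameters as 32-bit floats,
# before counting activations, gradients, or optimizer state:
param_bytes = trained_params * 4          # 640 GB
ratio = trained_params / brain_neurons    # ~1.86
```

    So even just storing the parameters of the larger network needs hundreds of gigabytes, which is why this lives on supercomputers rather than workstations.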

    So step 1 to making Jarvis: construct the world's largest supercomputer (maybe only 2 times bigger than Titan or so, just to hold the network and compute with it).

    The next issue is training data. Humans really have computers beat here. First off, an evolutionary process set us up with a bunch of super crazy good unsupervised learning that comes preloaded into our brains. Luckily, we are pretty good at evolutionary computation, so presumably we could, given a very long time and a lot more compute power, simulate millions of years of evolution. We'll tackle that next. For now, let's consider the kind of data humans train their networks on. A 5-year-old has about 3 years of solid, hi-def, hi-res AV data plus feeling. We're talking petabytes of data just to get to 5 years old. But really, we can handle that kind of data in a relatively small data center nowadays. I think my campus's supercomputer can handle a few petabytes pretty efficiently, and on top of that I'm pretty sure humans throw out at least a petabyte of that data over that time (thanks to our sweet ass unsupervised-learned brains). But one person's data won't do. We need many, many people's data so that we don't overfit the evolutionary process I'll explain next.
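    Rough arithmetic on the petabytes claim, assuming (my assumption, not a measurement) raw 1080p RGB video at 30 frames per second:

```python
# Back-of-envelope for "3 years of solid, hi-def AV data".
seconds = 3 * 365 * 24 * 60 * 60          # ~3 years of waking input
frame_bytes = 1920 * 1080 * 3             # one uncompressed RGB frame
bytes_per_second = frame_bytes * 30       # 30 frames per second

total_bytes = bytes_per_second * seconds
petabytes = total_bytes / 1e15            # roughly 17-18 PB
```

    Compression would knock orders of magnitude off that, but for video alone, uncompressed, "petabytes" checks out.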

    So step 2: acquire petabytes upon petabytes of high-quality, relevant data over years and years.

    As mentioned, our brains are fucking awesome already out of the box. Our brains have a macro structure that we evolved over millions of years, and this kind of built-in structure is the real crux of deep learning. Without it, you need much, much larger networks to accomplish the same tasks. So one way out of this is to just really up the ante many-fold on step 1. We evolved this structure over millions of years of evolution. The cool thing is we can simulate this using evolutionary computation: hundreds of thousands of generations with large population sizes, each time performing the expensive operation of training a huge network on a lifetime of data. We could actually, theoretically, figure out a pretty good structure with this method using current technology.
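    A toy sketch of evolving the structure itself, where the structure is just a list of layer widths and a cheap placeholder stands in for the absurdly expensive "train a huge network on a lifetime of data" step:

```python
import random

def train_and_score(layers):
    # Placeholder fitness: in reality this is the expensive step of
    # training the candidate network and measuring its performance.
    # This stand-in just prefers moderate depth and width.
    avg = sum(layers) / len(layers)
    return -abs(len(layers) - 4) - abs(avg - 64) / 64

def mutate(layers):
    # Structural mutations: add a layer, drop a layer, or resize one.
    layers = layers[:]
    op = random.choice(["grow", "shrink", "resize"])
    if op == "grow":
        layers.insert(random.randrange(len(layers) + 1),
                      random.choice([16, 32, 64, 128]))
    elif op == "shrink" and len(layers) > 1:
        layers.pop(random.randrange(len(layers)))
    else:
        layers[random.randrange(len(layers))] = random.choice([16, 32, 64, 128])
    return layers

pop = [[32] for _ in range(10)]           # start with tiny one-layer nets
for _ in range(30):                       # generations
    pop.sort(key=train_and_score, reverse=True)
    pop = pop[:5] + [mutate(random.choice(pop[:5])) for _ in range(5)]
best = max(pop, key=train_and_score)
```

    With a real `train_and_score`, each of those 30 generations multiplies the cost of step 1 by the population size, which is where the "thousands of supercomputers" below comes from.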

    So step 3: spend absolutely absurd amounts of time figuring out what the structure of the network should look like. We'll probably need thousands of the supercomputers from step 1 to do this in a timely manner, but we can trade computational resources for time here, so pick your poison.

    Congrats, you have made something about as useful as a 5-year-old!

    TL;DR
    Step 1: Build the largest supercomputer ever built, by many orders of magnitude
    Step 2: Crowdsource very detailed, annotated training data to cover a huge number of human lives' worth of data
    Step 3: Spend a huge amount of time, maybe a hundred years but it's very hard to say, finding the best structure for this network
    Step 4: Profit

    It should be noted that there are a lot of complexities I have not dealt with here. Further still, human brains are plastic, meaning that during training the structure of the network can change! I don't know of any research that allows for this kind of thing, but it can be compensated for by simply adding more layers and more connections between them (thus adding a lot of time and resources to our process here). We could even cut out step 3 and just train a very, very large network so that structure doesn't matter! We're still going to need a ton of compute capability.

    Really, I imagine that in the next 100 years, if all the world's governments came together, we could pull this off. All for the production of a 5-year-old that takes a data center to run and only like a million dollars a day to power. Sounds worth it, right?
  3. In Topic: What's everyone been doing the last few years?

    Posted 5 Apr 2016

    I'm so happy Skaggles popped back in so that I could see that quote of mine from when I was 16.
  4. In Topic: BusinessInsider swings and misses with "must know languages"

    Posted 19 Mar 2016

    macosxnerd101, on 12 February 2016 - 07:27 AM, said:

    modi123_1, on 29 January 2016 - 12:25 PM, said:

    Reminds me of the amusing hierarchy of superiority..

    Yeah, posets preorders!

    FTFY
  5. In Topic: Facial Hair

    Posted 19 Mar 2016

    I'm rocking a 5 o'clock shadow that took me a mere week to grow. I'm planning on joining ZZ Top after No-Shave November.

My Information

Member Title:
spelling expret
Age:
22 years old
Birthday:
September 8, 1993
Gender:
Location:
Marion, KS
Interests:
making and playing video games, language design, IRC bots, C++ templates, learning new languages
Forum Leader:
C++, Assembly
Full Name:
Jake Ehrlich
Years Programming:
4
Programming Languages:
Experienced: C/C++, Lua
Ok: D, Java, Haskell, Python, x86 assembly
Learning: PHP

Contact Information

E-mail:
Private
Website URL:

Comments

  • (6 Pages)
  • +
  • 1
  • 2
  • 3
  • Last »
  1. Martyr2

    22 Aug 2013 - 10:26
    Just wanted to thank you for my 4000th rep vote back on Aug 12th. Appreciate it!
  2. Precise

    18 May 2013 - 13:32
    Hello I would like to speak to you about something important. About a game. email me at [email protected] I dont know how else to get in contact with you. Thanks.
  3. Xupicor

    24 Nov 2012 - 01:19
    No witty rep meaning at the moment. Though 1415 AD was memorable, that's the year Jan Hus (John Huss) was burned, an event which pushed his followers (later known as Hussites) to fight. Hussite Wars were a bloody period, but mighty interesting. ;)
  4. Xupicor

    21 Nov 2012 - 13:24
    Now your rep says you're "lala" :P That could mean "a doll" where I'm from. Yes, it has more than meaning here too... ;)
    So, how's it going, doll? ;p
  5. ishkabible

    10 Sep 2012 - 10:42
    Hey, my rep says I'm 'leet'
  6. BenignDesign

    07 Jul 2012 - 05:18
    spelling expret works too!
  7. BenignDesign

    06 Jul 2012 - 18:48
    Then I would go legitimately subtle... like "speling expert" or "spelling export" .... drop or change ONE letter... hence, subtle.
  8. BenignDesign

    05 Jul 2012 - 10:31
    I don't remember what it said in the first place.
  9. BenignDesign

    01 Jul 2012 - 21:15
    Not nearly subtle enough.
  10. ishkabible

    30 Jun 2012 - 21:20
    perhaps...it has to be subtle enough though
  11. BenignDesign

    30 Jun 2012 - 16:23
    Your title would be WAY funnier if there was a spelling error in it.
  12. hulla

    11 Mar 2012 - 08:45
    1K rep :)
  13. r.stiltskin

    04 Dec 2011 - 20:19
    Been busy.
  14. hulla

    21 Nov 2011 - 08:23
    What's an esoteric hack? O.O
    What's a purely functional programming language? I've never used one before but they sound interesting . . .
  15. ishkabible

    22 Oct 2011 - 20:02
    im a "master" now!!