GoogleBot does actually handle sites written in js. The big problem with ajax sites is that, even if GoogleBot can execute js and handle ajax requests, it's not really possible for the crawler to know when the page has finished loading. For that reason, a crawler could load a page and index it before the ajax requests have even started. Say a script only runs on page scroll: it's very likely that GoogleBot will not trigger every possible event.
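For example, here's a minimal sketch of content that only gets fetched when the user scrolls (the `/api/more-items` endpoint and `#feed` container are hypothetical); a crawler that never scrolls will never see that content:

```javascript
// Minimal sketch: content is fetched only when the user scrolls near the bottom.
// The "/api/more-items" endpoint and "#feed" container are hypothetical.
window.addEventListener('scroll', function () {
  var nearBottom = window.innerHeight + window.scrollY >= document.body.offsetHeight - 200;
  if (!nearBottom) return;

  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/api/more-items');
  xhr.onload = function () {
    // This content only ever exists if the scroll event fires,
    // which a crawler that doesn't scroll will never trigger.
    document.querySelector('#feed').insertAdjacentHTML('beforeend', xhr.responseText);
  };
  xhr.send();
});
```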
The other problem is navigation.
Since navigation can be done without reloading the page, one url can map to multiple "views". For that reason, Google asks developers to keep static copies of the pages that would otherwise be inaccessible, so that they can still get indexed.
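To illustrate the "one url, many views" problem, here's a sketch of hash-based navigation: the real URL stays the same no matter which view is shown (the `#content` container and the view names are hypothetical):

```javascript
// Minimal sketch of hash-based navigation: the real URL never changes,
// only the fragment (#/about, #/products), so one URL maps to many views.
// The "#content" container and the view names are hypothetical.
var views = {
  '#/about': '<h1>About us</h1>',
  '#/products': '<h1>Our products</h1>'
};

function render() {
  document.querySelector('#content').innerHTML = views[location.hash] || '<h1>Home</h1>';
}

// Render once on load, then re-render whenever the fragment changes.
window.addEventListener('hashchange', render);
render();
```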
If your site makes each page accessible through a fully qualified url, then you shouldn't have any problem getting your site indexed.
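One way to get there (just a sketch, assuming your server can also serve `/products` directly) is to use the History API so each view gets its own real URL:

```javascript
// Minimal sketch using the History API: each view gets a fully qualified URL
// (e.g. /products) that the server can also serve directly to a crawler.
// The "#content" container and the "/products" route are hypothetical.
document.querySelector('a[href="/products"]').addEventListener('click', function (e) {
  e.preventDefault();
  history.pushState({}, '', '/products'); // URL becomes example.com/products
  document.querySelector('#content').innerHTML = '<h1>Our products</h1>';
});
```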
That said, scripts are going to get run, but it's not certain that the crawler will wait until every script has finished before indexing the page.
Here's a link:
GoogleBot smarter: it was written in 2010, and we can expect that web crawlers have gotten much smarter since then.