As the article you cite itself points out, in general the answer is no, provided that a few details are observed:
- Scripts in separate files whose access is blocked (via robots.txt) will not be executed, and consequently the site will not be indexed correctly (see the robots.txt sketch after this list).
- Servers that cannot handle the required volume of requests may impair the crawler's ability to render pages.
- Very complex scripts, or ones that stray too far from average browser compatibility, may result in incorrect or inaccurate rendering, which is bad for both the visitor and Googlebot.
- When scripts remove page content instead of adding it, that content cannot be indexed.
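
As an illustration of the first point only, here is a minimal robots.txt sketch that keeps script and style assets reachable by the crawler; the directory paths are hypothetical and would need to match your own project layout:

```
# Hypothetical robots.txt: keep JS/CSS reachable so Googlebot can render the page
User-agent: *
Allow: /assets/js/
Allow: /assets/css/
Disallow: /admin/

# A rule like the one below would block the scripts and break rendering:
# Disallow: /assets/js/
```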
Thus, the best answer is still to stick to good development practices, with or without the aid of frameworks such as AngularJS; the fact that the framework is maintained by Google does not necessarily mean that Googlebot's rendering of it will be 100% infallible.