~jpastuszek/blog

1de16a57d32cbc57f1da09032d7dfdf0ec9053a7 — Jakub Pastuszek 3 years ago 5edbb48
minor text fixes
1 files changed, 12 insertions(+), 12 deletions(-)

M content/2020-08-14-js/index.md
M content/2020-08-14-js/index.md => content/2020-08-14-js/index.md +12 -12
@@ -79,7 +79,7 @@ Because of the design of operating systems, an attacker exploiting browser vulne

See also:

-* [Google patches Chrome zero-day under active attacks](https://www.zdnet.com/article/google-patches-chrome-zero-day-under-active-attacks/).
+* [Google patches Chrome zero-day under active attacks](https://www.zdnet.com/article/google-patches-chrome-zero-day-under-active-attacks/)
* [Compromising the macOS Kernel through Safari by Chaining Six Vulnerabilities](https://github.com/sslab-gatech/pwn2own2020)
* [TiYunZong: An Exploit Chain to Remotely Root Modern Android Devices](https://raw.githubusercontent.com/secmob/TiYunZong-An-Exploit-Chain-to-Remotely-Root-Modern-Android-Devices/master/us-20-Gong-TiYunZong-An-Exploit-Chain-to-Remotely-Root-Modern-Android-Devices.pdf)



@@ -112,10 +112,10 @@ Some examples of misused standards and attacks are:

For some recent examples see:

-* How your home router can be hacked from the browser: [SOHO Device Exploitation](https://blog.grimm-co.com/2020/06/soho-device-exploitation.html).
-* How Reddit "hacks" your browser to protect themselves from bots and to track you: [Reddit's website uses DRM for fingerprinting](https://smitop.com/post/reddit-whiteops/).
-* [List of well-known web sites that port scan their visitors](https://www.bleepingcomputer.com/news/security/list-of-well-known-web-sites-that-port-scan-their-visitors/).
-* [Apple declined to implement 16 Web APIs in Safari due to privacy concerns](https://www.zdnet.com/article/apple-declined-to-implement-16-web-apis-in-safari-due-to-privacy-concerns/).
+* How your home router can be hacked from the browser: [SOHO Device Exploitation](https://blog.grimm-co.com/2020/06/soho-device-exploitation.html)
+* How Reddit "hacks" your browser to protect themselves from bots and to track you: [Reddit's website uses DRM for fingerprinting](https://smitop.com/post/reddit-whiteops/)
+* [List of well-known web sites that port scan their visitors](https://www.bleepingcomputer.com/news/security/list-of-well-known-web-sites-that-port-scan-their-visitors/)
+* [Apple declined to implement 16 Web APIs in Safari due to privacy concerns](https://www.zdnet.com/article/apple-declined-to-implement-16-web-apis-in-safari-due-to-privacy-concerns/)
+* [Web Browsers still allow drive-by-downloads in 2020](https://www.bleepingcomputer.com/news/security/google-chrome-adding-malicious-drive-by-downloads-protection/)

## Malicious script delivery vectors


@@ -172,21 +172,21 @@ On the server-side generated website the navigation can be done via the browsers

Another fundamental feature that easily breaks is links. A web application often uses buttons that trigger some code to run in the browser, so you cannot copy a link to the page the button leads to. Similarly, bookmarking or sharing links from the browser address bar is pointless, as the address often stays the same no matter where you are in the app. Also, visited links are normally rendered differently on a website, so you can see what information you have already accessed in the past.

-Browsers try to keep scroll positions across going back and forth in history consistent before reloads. This helps you to orient yourself in a long page quickly. Unfortunately when content is dynamically loaded this often breaks. It is very annoying for example when you are browsing products in on-line shops.
+Browsers try to keep scroll positions consistent when you go back and forth in history and between reloads. This helps you to orient yourself in a long page quickly. Unfortunately, this often breaks when content is loaded dynamically. It is very annoying, for example, when you are browsing products in on-line shops.

-This issues can be worked around by some extra code and use of history API but it requires extra work wherewith the server-side generated website you get it for free.
+These issues can be worked around with some extra code and use of the History API, but that requires extra work. With a server-side generated website you get this for free.

Browsers provide accessibility features, like keyboard-based navigation. These features are crucial for people who cannot use pointing devices easily. They work out of the box for plain HTML pages but may easily break with [JavaScript]-generated navigation elements and links.

For the web developer, on the other hand, the issues are:

* Search engines cannot readily index your web application's content - Google will index HTML content as soon as it discovers it, but will take time to run your scripts to [index JavaScript generated content].
-* Content of your application cannot be easily archived (e.g. by archive.org)
+* Content of your application cannot be easily archived (e.g. by archive.org).
* [JavaScript] adds extra failure modes which can prevent the website from loading correctly.
* Typically, all application code needs to be downloaded before it can run, which may increase the [bounce rate].
* "Generically-designed REST API that tries not to mix "concerns" will produce a frontend application that has to make lots of requests to display a page" - forcing you to join data from multiple tables/API endpoints on the client-side, which is the worst performance-wise place to do it.
* They can be a resource hog on clients, whose hardware specifications are unpredictable.
-* Standard websites are easier to test.
+* Non-interactive websites are easier to test.

See also: [Second-guessing the modern web](https://macwright.org/2020/05/10/spa-fatigue.html)


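The History API workaround mentioned in the hunk above could look roughly like the minimal sketch below. The names are hypothetical stand-ins, not code from this post: `renderView` is whatever renders a view for a path, `navigateTo` is the handler your links call, and `#app` is an assumed container element. The point is only to show real, copyable URLs plus manual scroll restoration.

```javascript
// Sketch only: take over scroll restoration, since dynamically loaded
// content defeats the browser's automatic behaviour.
history.scrollRestoration = 'manual';

// Hypothetical renderer: swaps the view for a given path into #app.
function renderView(path) {
  document.querySelector('#app').textContent = `Rendered view for ${path}`;
}

// Navigate without a full reload, but still produce a real, copyable URL.
function navigateTo(path) {
  // Remember the scroll position of the page we are leaving.
  history.replaceState({ scrollY: window.scrollY }, '', location.pathname);
  // Push a new history entry so Back/Forward and bookmarks keep working.
  history.pushState({ scrollY: 0 }, '', path);
  renderView(path);
  window.scrollTo(0, 0);
}

// Handle the browser's Back/Forward buttons.
window.addEventListener('popstate', (event) => {
  renderView(location.pathname);
  window.scrollTo(0, event.state?.scrollY ?? 0);
});
```

Even with this in place, every view still needs a distinct URL or bookmarking and link sharing remain broken; a server-side generated website gives you all of this without any script.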

@@ -245,13 +245,13 @@ See also: [The JavaScript Trap](https://www.gnu.org/philosophy/javascript-trap.h

### Life beyond HTTP

-There are other ways of browsing the internet. Protocols like [Gemini] and [Gropher] are alternative universes where described risks do not exist by design.
+There are other ways of browsing the internet. Protocols like [Gemini] and [Gropher] are alternative universes where the risks I have described do not apply by design.

## Summary

-[JavaScript]-based web applications are a good way to write client components for distributed software. They allow you to provide one client-side executable artefact that runs on all five of X11/Linux, macOS, Windows, Android, and iOS and is not subject to Apple Inc. censorship. But make sure you know your trade-offs as described here.
+[JavaScript]-based web applications are a good way to write client components for distributed software. They allow the developer to provide one client-side executable artefact that runs on all five of X11/Linux, macOS, Windows, Android, and iOS. But make sure you know the trade-offs described here.

-If in another hand you want to present a content that is indexable, searchable, navigable and readable by default just use plain HTML/CSS and only use [JavaScript] when otherwise browser is missing a feature that you really need.
+If you want to present content that is indexable, searchable, navigable and readable by default, just use plain HTML/CSS and only use [JavaScript] when the browser would otherwise be missing a feature that you really need.

When you are browsing the internet, make sure you use proper tools to stay safe, and encourage safe website designs with your choices and feedback.