Commented Links for 2020-06-11 #links #python #infrastructure #pong #elm #alan kay #objects #paywalls #brave #notes #rust #writing #concepts #css #algorithms
Infrastructure as (Python) Code, Pong in Elm, Alan Kay and Objects, Paywalls, Brave, Note Taking, Rust From Scratch, Writing, 51 Concepts You Should Know, MAD in CSS, Magic Algorithms.
For some time I've been postponing writing an Ansible playbook to set up my DigitalOcean droplet -- for no real reason besides laziness -- and now that there is this pure-Python way to set up an environment, I'll probably never start either setup -- again, because I'm lazy.
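To give a feel for the idea (this is my own toy sketch, not the library the link points to, and the package/service names are purely hypothetical): "infrastructure as plain Python" just means the playbook is an ordinary Python function, so you get loops, conditionals and dry runs for free.

```python
# Minimal sketch of "infrastructure as plain Python": each provisioning
# step is a shell command, and the "playbook" is just a function.
import shlex
import subprocess

def run(command, dry_run=False):
    """Run one shell step; in dry-run mode only report what would happen."""
    if dry_run:
        return f"would run: {command}"
    result = subprocess.run(shlex.split(command), capture_output=True, text=True)
    return result.stdout.strip()

def provision(dry_run=False):
    """The whole 'playbook' is plain Python calling steps in order."""
    steps = [
        "apt-get install -y nginx",  # hypothetical package step
        "systemctl enable nginx",    # hypothetical service step
    ]
    return [run(step, dry_run=dry_run) for step in steps]

print(provision(dry_run=True))
```

Because it is just Python, `provision(dry_run=True)` lets you inspect the plan before touching the machine -- something a YAML playbook needs extra tooling for.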
There is something deeply satisfying in reading a tutorial that starts from the very beginning and explains every little step needed.
I just miss the "If you do this, it won't work/will crash" parts.
Ah, I just love this kind of discussion, which comes up over and over and over -- I basically heard it every time I went to the local Elixir meetup.
The gist is: when Alan Kay was talking about "object-oriented design", he was talking about the communication between objects, not about encapsulation, inheritance and so on.
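A toy illustration of that point (mine, not from the linked discussion): the sender below only knows the message protocol -- here a hypothetical `receive` method -- and nothing about the receivers' classes, so no inheritance is involved at all.

```python
# Kay-style "objects": what matters is the message exchanged, not the
# class hierarchy. Logger and Mailer share no base class.
class Logger:
    def receive(self, message, payload):
        return f"log: {message} {payload}"

class Mailer:
    def receive(self, message, payload):
        return f"mail: {message} to {payload}"

def notify(receivers, message, payload):
    # The sender sends the same message to every receiver and lets each
    # one decide how to respond -- pure message passing.
    return [r.receive(message, payload) for r in receivers]

print(notify([Logger(), Mailer()], "signup", "alice"))
```

Swapping in a new receiver only requires honoring the message, which is closer to what Kay meant than building deep class trees.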
Some of the words in the post are quite strong -- "I would be happier in a world where major newspapers ceased to exist, compared to the world where they exist but their articles are paywalled" -- but the actual point being made is quite true: paywalled content usually takes the route of "let me twist your imagination/curiosity so you pay to actually see the content". And maybe the content was produced only to bait your curiosity, with absolutely nothing of actual research or substance behind it.
At the same time, if the content were actually good, based on research, and had substance that would live on (and not something interesting for this week only, completely forgotten the next), then paywalled content would be worth paying for.
That was not the first time Brave was caught doing something morally questionable with users' content. At some point, one wonders whether, if you shared a referral link to some service with a friend -- say, to take advantage of some reward on DigitalOcean -- instead of giving the reward to you, it would hand it to the Brave company so they can run their servers.
In a way, it just shows how hard it is to produce a browser these days, even if you take some previously existing codebase and improve it. But morally questionable actions also seem to be the way most companies approach the internet these days...
I've been thinking about a way to improve my note-taking workflow, so the knowledge is not completely lost. And the "Zettelkasten" way of taking notes keeps appearing on my timeline from time to time.
So it is nice that a simple introduction exists, although I still have to actually start doing it.
If you're interested in Rust but have no idea how to start or where to go, Luca Palmieri is writing a "book" about the whole process.
Tips on how to write gooder1. The tips are pretty precise and direct.
Not so much "ideas", more like "concepts".
Also, as any good list, there are 51 concepts, not just 50.
Ah, the last page of MAD. I do remember trying over and over to make the folding correct, so the proper picture would appear. It's kind of obvious that, once we automated stuff, there should be a way to do this.
On the other hand, I have the same opinion here as with any very complex CSS example: ok, now center the text in this box.
Ok, let's discuss this for a bit: the thing showing people that COVID is a Chinese government weapon gone rogue, that racism is not a problem and white people also suffer racism, and that decapitating statues is wrong is not Zuckerberg's doing, but "the algorithm".
Here is the problem, though: although Zuckerberg was not the one who created "the algorithm", people who work for him did. Also, "the algorithm" didn't simply appear and decide what you like; someone put it there. This is what most people get wrong about artificial intelligence and "algorithms": they don't simply appear; someone puts things there, and they act according to what that person put there.
Take, for example, the fact that Google was tagging black people as "gorillas" in their Photos. It was not a "problem with the algorithm"; someone working at Google decided black people weren't important enough to add to the training set -- worse, that person (or group of people) didn't even think that adding black people to the training set was something worthwhile, or even notice it was missing.
This is not the algorithm; it's people. People are behind every single "magic" algorithm out there.
And although Zuckerberg was probably not involved in the construction of the algorithm, the people were the problem. Not the algorithm.
Yes, I wrote that wrong on purpose.
This post was built with the help of