
The 7 Best COVID-19 Resources We’ve Discovered So Far


With all eyes on the COVID-19 pandemic and how its impact will be felt over the coming weeks and months, people are being bombarded with all kinds of noise and speculation.

Between a deadly virus, looming economic effects, and numerous government shutdowns, it’s clear that a fertile breeding ground has been created for misinformation, rumors, conspiracy theories, hot takes, and other potentially misleading content.

7 Indispensable COVID-19 Resources

At Visual Capitalist, it’s our goal to use data-driven visuals to explain the world around us.

In the last week alone, we’ve had more than 10 million people visit our site — many of them trying to understand more about COVID-19 and its effects on the economy and society.

With that in mind, we thought we’d curate a list of quality information on the virus and its impact. These COVID-19 resources all come from fact-driven, reliable sources, and some were even created by our in-house team and shared with our free daily mailing list.

In the list below, we start with the more contextual resources (understanding how the virus works, pandemic history, etc.) and then progress to real-time dashboards and up-to-date data.

Click any image below to see the full resource or dashboard. Many are updated daily or in real-time.

1. How Coronaviruses Work


What is a coronavirus, and how does COVID-19 fit into the mix?

This educational scrolling infographic by SCMP walks you through some of the more familiar types of coronaviruses, how they spread, and how they affect the human body.

It also relates COVID-19 to other coronaviruses that cause diseases such as MERS, SARS, and even the common cold.

2. The History of Pandemics


On March 11th, the World Health Organization declared COVID-19 a pandemic.

In this infographic, we look at the data to show you the history of pandemics, all the way from the Black Death to the current COVID-19 situation. It gives historical context on how bad a pandemic can be, and it’s updated every day so you can see how COVID-19 compares to the impact of these previous events.

3. Coronavirus Simulator: Limiting the Exponential Spread


Why does the virus spread at an exponential rate, and what techniques can be used to mitigate that spread?

This fantastic interactive page by the Washington Post actively simulates what happens when the virus spreads normally, contrasting it to how it may spread in a forced quarantine environment or when social distancing is practiced.
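
As a toy illustration of exponential growth (the numbers here are invented, not taken from the simulator), a single case doubling every three days becomes over a thousand cases in a month:

for i in $(seq 0 10); do echo "day $((i*3)): $((2**i)) cases"; done   # ends with "day 30: 1024 cases"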

4. Real-time COVID-19 Map


If you haven’t seen this useful real-time dashboard by Johns Hopkins University yet, it’s worth bookmarking right now.

We check the resource every day, and it has the latest numbers for COVID-19 cases, deaths, recoveries, and more — and it’s all sorted by country and/or state and province. Importantly, it also updates in real-time, so you always know you are getting the latest numbers.

5. Which Countries are “Flattening the Curve”?

Our post on which countries are “flattening the curve” has had over a million views in the last week alone, and it features the above interactive graph from Our World in Data.

Go to the post itself to see a bigger version of the logarithmic chart, which plots the progress of different countries in flattening the curves of COVID-19 infections. The interactive chart updates daily based on the latest numbers, and you can search for any country by using the “Search” button. Using the filters on the right side, you can also sort by region.

6. Tracking the Coronavirus: The Latest Figures


Even though the Financial Times is a subscription-based website, it recently published this useful COVID-19 dashboard and made it accessible to everyone.

It features various charts and tables on the countries affected, as well as ongoing assessments on the economic damage caused by the virus. Like many of the other COVID-19 resources featured on this list, it is updated on a daily basis.

7. COVID-19 Stats and Research

The above graphic is one of many available on Our World in Data, a fantastic initiative led by economist Max Roser.

Their coronavirus research page has tons of stats, citations, and data for those who want to dive deeper into the situation. It’s also updated very regularly.

Bonus: The Coronavirus Explained, and What You Should Do

While this is less data-driven than the other pieces of content, this animated video by Kurzgesagt still provides a handy explainer on how the virus works.

It’s about eight minutes long, and might help you fill other knowledge gaps.

Please Share These Resources

At a time when misinformation can be dangerous and even deadly, it is worth spreading the above COVID-19 resources to your friends, family, and colleagues.

Many of the above resources are updated daily or contain evergreen information, meaning they will not go out of date any time soon.

Wishing you a safe next few months,
– The Visual Capitalist team

PS: If you have any other great resources to share, please post them in the comments!




Critical security announcement: Suspicious git activity detected


We've learned of suspicious git activity using valid credentials (a password or personal access token) on GitLab. We identified the source based on a support ticket filed by Stefan Gabos yesterday, and immediately began investigating the issue.

The breaches seem to rely on the attacker having knowledge of the affected users’ passwords in order to wipe their Git repositories and hold them for ransom. We have notified affected GitLab users and are working as quickly as possible to resolve the issue.

“As a result of our investigation, we have strong evidence that the compromised accounts have account passwords being stored in plaintext on a deployment of a related repository. We strongly encourage the use of password management tools to store passwords in a more secure manner, and enabling two-factor authentication wherever possible, both of which would have prevented this issue.” - Kathy Wang, Senior Director, Security

How you can protect yourself

These breaches seem to rely on the attacker having knowledge of the affected user’s password. We highly recommend using strong, unique passwords for everything (a good password manager helps with this). We also strongly recommend that all GitLab users enable two-factor authentication and use SSH keys to strengthen their GitLab accounts.
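
For the SSH key recommendation, the usual setup looks roughly like this (a generic sketch, not part of GitLab's announcement; the email address is a placeholder):

ssh-keygen -t ed25519 -C "you@example.com"   # creates ~/.ssh/id_ed25519 and id_ed25519.pub
cat ~/.ssh/id_ed25519.pub                    # copy the public key into your GitLab profile's SSH Keys page
ssh -T git@gitlab.com                        # should greet you by username once the key is accepted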

You may further protect your groups by applying the “Require all users in this group to setup Two-factor authentication” setting in the Group settings under “Permissions, LFS, 2FA”.

In a case like this one, that setting can help prevent a breach.

Mitigation

We believe that no data has been lost, unless the owner/maintainer of the repository did not have a local copy and the GitLab copy was the only one. In some cases, repository files were changed. After updating account credentials, we recommend making use of git commands to restore your repository to its previous state. If you have a full current copy of the repository on your computer, you can force push to the current HEAD of your local copy using:

git push origin HEAD:master --force

Otherwise, you can still clone the repository and use git’s built-in recovery commands.

As this is related to the use of git, GitLab does not have its own documentation or examples, but third-party articles on recovering lost commits with git may be of use. A rough sketch of that kind of recovery follows.
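
This sketch assumes the old commit objects still exist and were only de-referenced, not deleted; <sha> is a placeholder for the hash of your last good commit:

# In an existing local clone (even a stale one), locate the last good commit:
git reflog                       # history of where HEAD has pointed locally
git fsck --lost-found            # lists dangling commits no longer reachable from any branch

# Point the branch back at that commit and publish it:
git reset --hard <sha>
git push origin HEAD:master --force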

Details

On May 2, 2019, at approximately 10:00pm GMT, GitLab received the first report of a repository being wiped, with a single file left in place demanding that a bitcoin ransom be paid for the return of the data:

To recover your lost data and avoid leaking it: Send us 0.1 Bitcoin (BTC) to our Bitcoin address 1ES14c7qLb5CYhLMUekctxLgc1FV2Ti9DA and contact us by Email at admin@gitsbackup.com with your Git login and a Proof of Payment. If you are unsure if we have your data, contact us and we will send you a proof. Your code is downloaded and backed up on our servers. If we dont receive your payment in the next 10 Days, we will make your code public or use them otherwise.

We began to receive multiple reports, and were able to search through logs and repositories to determine the extent of the impact. A few repositories had the ransom threat left behind, some repositories were simply wiped, and a few accounts appeared to have been successfully accessed by the attacker but not modified. In total, 131 users and 163 repositories were, at a minimum, accessed by the attacker. Affected accounts were temporarily disabled, and the owners were notified.

We noticed the following items in reference to this incident:

  • The source IP of the attacks came from the 185.234.216.0/24 range.
  • The attacker appeared to use some type of “update script” in an attempt to perform the accesses, and the nature of the individual accesses strongly suggested the use of plaintext passwords that were locally stored.
  • Virtually all of the repositories were private repositories.
  • None of the accounts impacted had two-factor authentication enabled.

Since not all of the accesses resulted in both a repository wipe and a ransom note, this suggests that the attacker’s update script was possibly not working properly. This could be a result of a generic script being used against GitLab as well as GitHub and Bitbucket.

Conclusion

We are continuing our investigation, and if we uncover more details that we feel will benefit our users and the security community at large, we will communicate that information as quickly and as transparently as possible. We are constantly looking for ways to improve our security and would appreciate any comments and questions you might have.


Round Earth Test Strategy


The “test automation pyramid” (examples of it are easy to find online) is a popular idea, but I see serious problems with it. I suggest in this article an alternative way of thinking that preserves what’s useful about the pyramid, while minimizing those problems:

  1. Instead of a pyramid, model the situation as concentric spheres, because the “outer surface” of a complex system generally has “more area” to worry about;
  2. ground it by referencing a particular sphere called “Earth” which is familiar to all of us because we live on its friendly, hospitable surface;
  3. illustrate it with an upside-down pyramid shape in order to suggest that our attention and concern are ultimately with the surface of the product, “where the people live,” and also to indicate opposition to the pyramid shape of the Test Automation Pyramid (which suggests that user experience deserves little attention);
  4. incorporate dynamic as well as static elements into the analogy (i.e. data, not just code);
  5. acknowledge that we probably can’t or won’t directly test the lowest levels of our technology (i.e. Chrome, or Node.js, or Android OS). In fact, we are often encouraged to trust it, since there is little we can do about it;
  6. use this geophysical analogy to explain more intuitively why a good tooling strategy can access and test the product on a subterranean level, though not necessarily at a level below that of the platforms we rely upon.

Good analogies afford deep reasoning.

The original pyramid (really a triangle) was a context-free geometric analogy. It was essentially saying: “Just as a triangle has more area in its lower part than its upper part, so you should make more automated tests on lower levels than higher levels.” This is not an argument; this is not reasoning. Nothing in the nature of a triangle tells us how it relates to technology problems. It’s simply a shape that matches an assertion that the authors wanted to make. It’s semiotics with weak semantics.

It is not wrong to use semantically arbitrary shapes to communicate, of course (the shapes of a “W” and an “M” are opposites, in a sense, and yet nobody cares that what they represent are not opposites). But at best, it’s a weak form of communication. A stronger form is to use shapes that afford useful reasoning about the subject at hand.

The Round Earth model tries to do that. By thinking of technology as concentric spheres, you understand that the volume of possibilities (the state space of the product) tends to increase dramatically with each layer. Of course, that is not necessarily the case, because a lot of complexity may be locked away from the higher levels by the lower levels. Nevertheless that is a real and present danger with each layer you heap upon your technology stack. An example of this risk in action is the recent discovery that HTML emails defeat the security of PGP email. Whoops. The more bells, whistles, and layers you have, the more likely some abstraction will be fatally leaky. (One example of a leaky abstraction is the concept of “solid ground,” which can both literally and figuratively leak when hot lava pours out of it. Software is built out of things that are more abstract and generally much more leaky than solid ground.)

When I tell people about the Round Earth model they often start speaking of caves, sinkholes, landslides, and making jokes about volcanoes and how their company must live over a “hot spot” on that Round Earth. These aren’t just jokes, they are evidence that the analogy is helpful, and relates to real issues in technology.

Note: If you want to consider what factors make for a good analogy, Michael Bolton wrote a nice essay about that (Note: he calls it metaphor, but I think he’s referring to analogies).

The Round Earth model shows testing problems at multiple levels.

The original pyramid has unit testing at the bottom. At the bottom of the Round Earth model is the application framework, operating environment, and development environment: in other words, the Platform-That-You-Don’t-Test. Maybe someone else tests it, maybe they don’t. But you don’t know and probably don’t even think about it. I once wrote Assembler code to make video games in 16,384 bytes of memory. I needed to manage every byte of memory. Those days are long gone. Now I write Perl code and I hardly think about memory. Magic elves do that work, for all I know.

Practically speaking, all development rests on a “bedrock” of assumptions. These assumptions are usually safe, but sometimes, just as hot lava or radon gas or toxified groundwater breaks through bedrock, we can also find that lower levels of technology undermine our designs. We must be aware of that general risk, but we probably won’t test our platforms outright.

At a higher level, we can test the units of code that we ourselves write. More specifically, developers can do that. While it’s possible for non-developers to do unit-level checks, it’s a much easier task for the devs themselves. But, realize that the developers are working “underground” as they test on a low level. Think of the users as living up at the top, in the light, whereas the developers are comparatively buried in the details of their work. They have trouble seeing the product from the user’s point of view. This is called “the curse of expertise:”

“Although it may be expected that experts’ superior knowledge and experience should lead them to be better predictors of novice task completion times compared with those with less expertise, the findings in this study suggest otherwise. The results reported here suggest that experts’ superior knowledge actually interferes with their ability to predict novice task performance times.”

[Hinds, P. J. (1999). The curse of expertise: The effects of expertise and debiasing methods on prediction of novice performance. Journal of Experimental Psychology: Applied, 5(2), 205–221. doi:10.1037/1076-898x.5.2.205]

While geophysics can be catastrophic, it can also be more tranquil than a stormy surface world. Unit level checking generally allows for complete control over inputs, and there usually aren’t many inputs to worry about. Stepping up to a higher level (interacting sub-systems) still means testing via a controlled API, or command-line, rather than a graphical interface designed for creatures with hands and eyes and hand-eye coordination. This is a level where tools shine. I think of my test tools as submarines gliding underneath the storm and foam, because I avoid using tools that work through a GUI.

The Round Earth model reminds us about data.

Data shows up in this model, metaphorically, as the flow of energy. Energy flows on the surface (sunlight, wind and water) and also under the surface (ground water, magma, earthquakes). Data is important. When we test, we must deal with data that exists in databases and on the other side of micro-services, somewhere out in the cloud. There is data built into the code, itself. So, data is not merely what users type in or how they click. I find that unit-level and sub-system-level testing often neglects the data dimension, so I feature it prominently in the Round Earth concept.

The Round Earth model reminds us about testability.

Complex products can be designed with testing in mind. A testable product is, among other things, one that can be decomposed (taken apart and tested in pieces), and that is observable and controllable in its behaviors. This usually involves giving testers access to the deeper parts of the product via command-line interfaces (or some sort of API) and comprehensive logging.
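
As a purely hypothetical illustration (the product name and flags below are invented, not taken from any real tool), such hooks let a tester drive one sub-system directly instead of going through the GUI:

./myapp --load-state fixtures/checkout.json    # controllability: inject a known starting state
./myapp --run-step apply-discount              # decomposability: exercise one piece in isolation
./myapp --dump-state | diff - expected.json    # observability: compare actual state to expected
grep ERROR logs/myapp.log                      # comprehensive logging supports diagnosis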

Epigrams

  • Quality above requires quality below.
  • Quality above reduces dependence on expensive high-level testing.
  • Inexpensive low-level testing reduces dependence on expensive high-level testing.
  • Risk grows toward the user.



Aikido Einsteigerkurs im September 2018


In September, the Aikido group of ASG Yawara is offering a four-week beginners’ course introducing the Japanese martial art of Aikido.

The course sessions take place on Fridays from 19:30 to 21:30 in the dojo of ASG Yawara, Hamburger Straße 16 in Ahrensburg. The first session is on September 7, 2018.

The course offers an introduction to the fundamental concepts and practice forms of Aikido:

  • The warm-up releases muscle tension, improves mobility, and strengthens the core muscles.
  • Rolling exercises teach how to fall without injury. They improve dexterity and awareness of one’s own body.
  • Breathing exercises coordinate body and mind.
  • Aikido techniques redirect the attack so that the defender can control the force directed against them. Moving against each other becomes moving with each other.
  • Aikido is the peaceful martial art. We develop our skills without competition and without aggression.

In the course, you learn Aikido together with other newcomers, so you are not alone with the difficulties and questions of getting started. At the same time, the experienced members of the Aikido group support the beginners during training.

Participation in the course is free of charge.

As with a trial training session, jogging pants, a sweatshirt, and a pair of slippers are all you need as training clothes.

Questions and registrations for the course can be sent by e-mail to aikido@kampfkunst-ahrensburg.de.

Aikido at Yawara Ahrensburg




Next Saturday is the day: the MMT (Mit-Mach-Tag) is just around the corner!


[Poster: Yawara Mit-Mach-Tag, DIN A2]

The planned schedule is as follows:

13:00 Doors open
13:30 Jiu Jitsu (children)
14:15 Kung Fu Kids
15:00 Jiu Jitsu
16:00 Krav Maga
16:45 Kung Fu
17:30 Aikido
18:15 Karate
19:00 End


Add SSL to your personal website

Give yourself a gift this holiday season, and add SSL to your personal site. The web is going secure, and it's time to be part of the solution. This article details how I turned on SSL + custom domains, plus automated deploys, for my personal site for the cost of a domain (which I already had) and $5/year. Read on!


Turns out, it's easier (and more affordable!) than you think to add SSL to your website. But first, why bother? There are lots of reasons why you should care about adding SSL:

  • Search engines are preferring SSL
  • New web APIs (like service worker) mandate SSL
  • Users trust SSL
  • Bonus: SSL can help enable HTTP/2 on some servers
Your setup will vary, so look for the easiest/shortest path to SSL for your particular site. Everyone has factors they want to optimize for. Here's what I was trying to optimize, as I looked for a solution.

I needed a solution that was:
  • Affordable
    • The solution should be very, very affordable. Affordable, in this context, means "as close to free as possible". My personal website is extremely low traffic. It doesn't make sense for me to pay a lot of money for something so small.
  • Easy
    • I don't have time to manage my personal site. The solution has to be simple and quick.
  • Sustainable
    • Because I don't have time to manage my site, I need a solution that is "set and forget" for as long as possible.
  • GitHub friendly
    • My site's source is on GitHub, and I needed a hosting+SSL solution that integrated with a "push to deploy" model.
  • Static file friendly
    • My personal site is extremely simple. I don't need anything other than a few static files.
  • Works with Custom Domains
    • I want to use my own domain.
I looked at a lot of options. Here's what didn't work for me, for a variety of reasons. They may work for you.

  • Managing my own VPS (e.g. on Digital Ocean or EC2).
    • This option completely rules out easy and sustainable. Manual configuration of servers, or keeping linux distros up to date, are two things I absolutely do not want to be doing.
  • Google Cloud Storage
    • GCS does serve static files, and even supports custom domains. However, they don't support custom domains with SSL. Bummer. It also doesn't support basic static file hosting features like redirects, so it's probably not an option anyway.
  • GitHub Pages
    • Great integration with GitHub (of course :), but they don't support custom domains with SSL. They support custom domains, and they support SSL via the github.io domain, just not custom domains and SSL together at the same time.
  • Amazon Web Services
    • I'm not aware of an AWS product that meets my needs. Maybe they have some awesome static file server with custom domains and SSL and git integration? I didn't see one.
  • Firebase Static Hosting
    • This option is actually really good, and it was almost my solution. Their setup is very simple, they support custom domains and SSL, and they have decent GitHub integration (it requires just a little bit of scripting to deploy after a push). The only downside is that it costs $5/month for custom domains (but the certificate is free and provided by Firebase). $60/year is a small price, especially considering that Firebase gives you an SSL certificate for free! Also, their static hosting is very good: they give you configuration options for redirects, custom 404 pages, and more. It's a very good option for most people. But, if $60/year is an issue (and it was hard for me to justify $60/year for a site that maybe serves 60 pages a year :), keep reading.
    • I should also note that it doesn't appear that Firebase supports IPv6 hosting. At least, their instructions didn't tell me to add IPv6 addresses to my DNS. This is probably a minor thing.
The hosting option that did work for me, after a lot of searching and reading, was: Google App Engine.

Google App Engine has a few things that made it a winner for me:
  • A completely free tier.
    • My personal site is way, way, way under the free tier limits.
  • Runs itself
    • App Engine just keeps on trucking, especially for a simple static site.
  • Custom domains
    • No need to upgrade to a paid tier to get this feature.
  • Custom certificates
    • You need to upload your own certificate, but you don't need to upgrade to a paid tier to get this feature.
  • Fine for simple static sites
    • For just a few pages, App Engine's configuration is decent. It's not as simple as Firebase's, but I don't anticipate needing redirects.
  • Can be deployed from a push to GitHub
    • Travis to the rescue! The free Travis CI system can trigger a deploy to App Engine, when you push to GitHub.
  • Support for "naked domains"
    • App Engine can now serve http://example.com. For the longest time, they required a subdomain, but naked domains now work.
  • Supports IPv6
    • Because future.
Google App Engine isn't perfect. If you want to do any redirects, you need to start writing Python. And it's not obvious how to set up App Engine for pure static hosting, nor is App Engine the simplest way to serve a static site (e.g. it's not good at recognizing optional trailing slashes in URL paths), but it can be done.

The next question was: where do I get an inexpensive SSL certificate? I looked around, and there are a lot of options and resellers. I purchased a three-year personal cert from https://ssls.com for a total of $15. That's 1/4 the price of one year of hosting with Firebase. The fact that I found a very affordable SSL cert is what really made App Engine a winner for me.

I assume you know about GitHub, how to get an App Engine account, and how to connect Travis to automate the builds. I know this looks like a lot of steps, but, remember, I'm doing three things here: custom domains, SSL, and automated deploys.

Here's a list of docs and some manual steps that helped me get my personal website setup for custom domain, SSL, and automated deploys from GitHub:
  • Custom domains and SSL for App Engine
    • Helps you link your domain to App Engine and walks you through generating the necessary files for the certificate.
  • Affordable personal SSL certs from ssls.com
    • I purchased a "PositiveSSL" cert with a three-year expiration.
    • Generate a CSR by running openssl req -nodes -newkey rsa:2048 -keyout myserver.key -out server.csr in a temp directory
    • Upload the CSR to your certificate vendor
    • You may need to perform additional verification steps. For example, I had to verify that I owned my domain by uploading a file to a special location on the server that serves my domain name.
  • Generating a service account from the Google Cloud Console
    • You can create a Service Account in the Google Cloud Console: go to “APIs & auth” -> “Credentials”, click “Add Credential” and then “Service Account”, and finally click “JSON” to download the JSON key.
  • Encrypting the JSON key
    • Install the travis command-line utils: sudo gem install travis -v 1.8.0 --no-rdoc --no-ri
    • run: cd your_website_dir
    • run: cp path/to/downloaded/cloud/service_account.json .
    • run: travis login --auto
    • run: travis encrypt-file service_account.json --add
    • run: rm service_account.json
      • DO NOT check this file in! Only check in the encrypted version.
  • My .travis.yml file, which kicks off the deploy script
    • Grab this, and add it to your project (or, diff it with your existing .travis.yml file and add the relevant lines).
  • My travis.sh script, which is the actual deploy script (a rough sketch of such a script follows this list)
    • This has the logic to download the Google Cloud SDK, configure authentication, and perform the actual deploy.
  • My app.yaml which configures my app for App Engine
    • I had to remove some values in order to work with the new gcloud command. For example, I had to remove the application and version keys from this file (they are set in travis.sh now, via gcloud).
  • Turning on App Engine Admin API and Google Cloud Storage JSON API in the "API Manager" of the Google Cloud Console.
    • None of the docs I found told me to do this. Took me a while to figure this part out!
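
To give a flavor of how the pieces fit together, here is a minimal sketch of what a travis.sh-style deploy script can look like. It is an illustration under assumptions, not the author's actual script; the SDK download URL, project ID, and file names are placeholders to adapt:

#!/bin/bash
set -e

# Fetch and unpack the Google Cloud SDK (consider pinning a version for reproducibility):
curl -sSL -o gcloud.tar.gz https://dl.google.com/dl/cloudsdk/channels/rapid/google-cloud-sdk.tar.gz
tar -xzf gcloud.tar.gz

# Authenticate with the service-account key that Travis decrypted during the build:
./google-cloud-sdk/bin/gcloud auth activate-service-account --key-file=service_account.json

# Deploy the static site; project and version are passed here rather than set in app.yaml:
./google-cloud-sdk/bin/gcloud app deploy app.yaml --project your-project-id --version 1 --quiet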

If you don't mind spending $60/year for hosting with a custom domain and SSL, consider Firebase. It involves significantly fewer steps.