There seems to be a documentation issue with Microsoft Defender ATP for Linux. The system requirements, as far as I can see, do not mention that systemd is needed. I noticed this on a Debian 9 (Stretch) system that was configured with SysV init. The post-install script of mdatp performs some tests that use the systemctl command, which is of course missing without systemd.
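A more defensive post-install script could guard against this. Here is a minimal sketch (in Python, not the actual mdatp code) of the canonical check: systemd is the running init exactly when the directory /run/systemd/system exists, which is what sd_booted(3) tests.

```python
import os

def systemd_is_running(probe_dir="/run/systemd/system"):
    """Return True when systemd is the running init.

    /run/systemd/system exists if and only if the system was booted
    with systemd (the same check sd_booted(3) performs), so a script
    can test for it before calling systemctl.
    """
    return os.path.isdir(probe_dir)

if systemd_is_running():
    print("systemd detected: systemctl is available")
else:
    print("no systemd (e.g. SysV init): skipping systemctl-based checks")
```

On the SysV Debian system described above, the check would fail cleanly instead of crashing on a missing systemctl binary.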
As part of a blog post about the new v14 of Chef Infra Server, it was announced that from now on existing functionality will be deprecated in favor of the cloud version. It will be interesting to see how this works out. Personally, I have never been a friend of forcing customers off an existing product. It is a dangerous move that bears the risk of customers switching vendors entirely. Especially so if it comes with a major architectural shift, like from on-premise to cloud.
I have been a happy user of Chef Server for about five years now, although only for a very small number of machines (single digit). The decision in favor of Chef had been made at a time when Ansible was still in its early stages. But with this latest development I will need to move away from Chef. It is a pity, because I really like the tool and have built various custom extensions.
How to approach the uncertainty of our future in the field of education.
This question comes up at timecode 8:45 in the video, and I found the response quite intriguing. Overall, though, the video is more about the impact of COVID-19 on teaching, and I highly recommend watching it.
As per “Google Cloud Application Modernization Program: Get to the future faster” (citing DevOps Research and Assessment), “teams that ship code numerous times per day are 1.53 times more likely to achieve or exceed their commercial goals, including profitability, and market share.” Many people will conclude from this that simply increasing the release rate is enough to be successful.
A similar study (I cannot remember the source right now) shows that people who use Firefox as their web browser have better careers. And I guess there are many more comparable “findings” that you can come across. Unfortunately, such conclusions are somewhere between misleading and completely wrong.
The problem is that such statements often present two things as cause and effect, when in reality those two things are “only” correlated. Both the high deployment rate and the commercial success are effects of the same underlying cause: these teams have experienced people who really know what they are doing.
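A tiny simulation illustrates the point. In the made-up model below, a hidden “skill” variable drives both the deployment rate and the commercial success; neither causes the other, yet the two end up strongly correlated (the coefficients and noise levels are arbitrary, chosen only for illustration):

```python
import random

random.seed(42)

n = 10_000
# The hidden common cause: how experienced/skilled a team is.
skill = [random.random() for _ in range(n)]

# Skill drives the deployment rate and, independently, the success.
deploy_rate = [10 * s + random.gauss(0, 1) for s in skill]
success = [5 * s + random.gauss(0, 1) for s in skill]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length lists."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Strong correlation despite zero causal link between the two.
print(f"correlation: {pearson(deploy_rate, success):.2f}")
```

Artificially boosting the deployment rate in this model (say, adding a constant to every entry of deploy_rate) would leave success completely unchanged, which is exactly the trap of reading such a correlation as causation.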
I had recently installed Linux in a dual-boot setup on a test machine (an old Lenovo Thinkpad T430). What proved more difficult than in the past was restoring the original state. Most of the recommendations I found online were less than helpful. In particular, many of them ignored the fact that there are two entirely different approaches to handling the boot process: UEFI with GPT, and legacy BIOS with MBR. My machine was using MBR (Master Boot Record), given its age.
What finally solved the issue was the following command:
C:\> bootsect /nt60 c: /mbr
I used a USB stick with the Windows 10 installer, but have since learned that you can get to the “repair” console more easily if your Windows 10 still boots. All you need to do is perform the following steps:
- Log off.
- When the login screen appears, press a key so that the password field shows up. This also enables the “power” button in the lower right corner of the screen.
- Press and hold the Shift key.
- Left-click the power button and choose “Restart”.
- Release the Shift key when the repair menu appears.
- Go to Troubleshoot → Advanced options → Command Prompt.
Maths can be interesting …
When a group of people is assigned a task they never had to deal with before, they often start with brainstorming. So you have a number of folks who typically are not exactly knowledgeable about something, but at the same time try to agree on an approach for dealing with it. So they start with a discussion on what to do, who should be assigned what activity etc.
The problem is that this is not brainstorming. It is a group of people talking about something that they do not know much about. So a lot of assumptions are made, often not even consciously. People simply extrapolate their past experiences into the new topic. But this approach does not deliver particularly good results.
The relatively obvious problem is inefficiency. Instead of the whole group talking 15 minutes about something, prior research by just a single person would have produced at least the same result. (Well, probably a better one.) The bigger problem, though, is about effectiveness. In other words: The result of the group discussion will usually not be a truly good solution. Again for lack of research and knowledge.
From a team dynamics perspective there is another problem. Some people tend to dominate such free-floating discussions. And apart from personality, the folks talking most are usually those who know least about the subject, because those who understand things at least partially know that it is not so easy. But there is no time for careful deliberation during such a meeting. So decisions are made based on the least helpful contributions, and the people who know best leave the meeting frustrated.
Luckily, it is very easy to overcome this. Just have people do some research on their own before the meeting. You will get a much more thoughtful discussion.
For the fans of Uncle Bob (aka Robert C. Martin) here is another interesting video.