Glad to learn that HTTP/0.9 is still "in use globally" then. A bit surprising, but since it's all about stretching definitions past what is reasonable, for the sole purpose of having the last word, let's shoehorn anything into anything to the infinity and beyond!!! 🤡🚀
I've heard similar from the worst first-year CS students you could ever meet. People constantly talk out their ass without the experience to back up their observations. The indentation thing is a reasonable heuristic: it says you're adding too much complexity at specific points in your code and should isolate core pieces of logic into discrete functions. And while that's broadly reasonable, it often has the downside of producing code with a lot of very small, very specific functions that are only ever invoked by other very small, very specific functions. That doesn't make your code easier to read or understand, and it arguably leads to scenarios where your code becomes very disorganized and needlessly opaque purely because you didn't want additional indentation, all to meet some arbitrary formatting guideline you set for yourself. This happens in any language, but some languages are more susceptible to it than others. PEP8's line length limit is treated like biblical edict by your more insufferable Python developers.
A) Yes. Large companies have entire departments dedicated to QA, and it's best not to leave QA to devs, if you can afford it. Dunno what you mean by "still," since the job never went away.
Dunno what to tell you. I do QA for a living. I see postings all the time for QA positions in other companies, and my company has had QA for at least two decades, with the department expanding over the last three years.
I'm not claiming it's ubiquitous, but maybe you're just out of the loop.
In Agile, QA testing should be involved throughout the whole development process, with QA not just following the development, but supporting it. QA testing should be implemented early and continuously, with constant feedback to developers to ensure that any issues are fixed quickly.
Which, when put to practice, means QAs become BAs, no comprehensive QA occurs, and when the code is shit because they have no actual QA support and the scope changes constantly with no firm documented requirements, the dev gets fired.
Great model for people who like to sit in meetings and complain.
When did you retire? Agile has been around for at least 20 years, more like 30 if you count scrum being introduced before agile was formally defined. No matter how critical I am of agile it is hardly a fad at this point
I stopped being able to find QA work in the early 2010s or so. Converted to BI Developer. Have not encountered a dedicated QA at any of the small assortment of jobs I have had since.
Edit: And fair, despite it being a waste-of-time cult mentality engineered to make developers suffer and enshitify software quality, Agile got enough Kool-Aid drinkers to qualify it as more than a fad.
I work at a company whose entire business model is providing QA to other companies. I work directly with some very large, public companies, and some smaller ones. Almost all of them have some form of dedicated in-house QA, which we supplement.
Agile was definitely taken in with the same irrationality as fashion at some point.
It's probably the best software development process philosophy for certain environments (for example: where there are fast-changing requirements and easy access to end users) whilst being pretty shit for others (good luck trying to fit it at a process level when some software development is outsourced to independent teams, or using it for high-performance systems design), and it eventually mostly came out of that fad period being used more for the right things (even if, often, less than properly) and less for the wrong things.
That said the Agile as fad phase was over a decade ago.
Don't take this badly but it sounds like you've only seen a tiny slice of the software development done out there and had some really bad experiences with Agile in it.
It's perfectly understandable: there are probably more bad uses of Agile out there than good ones and certain areas of software development tend to be dominated by environments which are big bloody "amateur hour every hour of the day, every day of the year" messes, Agile or no Agile.
That does however not mean that your experience stands for the entirety of what's out there, trumping even the experience of other people who also work in QA in environments where Agile is used.
Agile made Management, who had actual Senior Designer-Developers and Technical Architects designing and adjusting actual development processes, think that they had this silver bullet software development recipe that worked for everything, so they didn't need those more senior (read: more expensive and unwilling to accept the same level of exploitation as the more junior types) people anymore.
It also drove the part of the Tech Industry that relies mainly on young and inexperienced techies and management (*cough* Startups *cough*) to think they didn't need experienced techies.
As usual it turned out that "there are no silver bullets": things are more complex, Agile doesn't work well for everything, and various individual practices of it only make sense in some cases (in some they're even required for the rest to work properly) whilst in others they're a massive waste of time (and in some cases, the useful-wasteful balance depends on frequency and timing); plus, in some situations (outsourced development) they're extremely hard or even impossible to pull off at a project scope.
That said, I bet that what you think is "The Industry" is mainly Tech companies in the US, rather than where most software development occurs: large non-Tech companies with a high dependency on software for competitive advantage - such as Banks - and hence more than enough specific software requirements to hire vast software development departments to develop custom solutions in-house for their specific needs.
Big companies whose success depends on their core business-side employees doing their work properly care a lot more about software not breaking or even delaying their business processes (and hence hire QA to figure out those problems in new software before it even gets to the business users) than Tech companies providing software to non-paying retail users who aren't even their customers (the customers are the advertisers they sell access to the eyeballs of those users) and hence will shovel just about anything out and hopefully sort out the bugs and lousy UX/UI design through A/B testing and user bug-reports.
QA is also known as preventing shit from exploding and losing us millions of dollars in the process, or better yet, cybersec. Cybersec is just glorified QA
I worked for an actual QA department that produced actual documentation and ran actual full scale QA cycles.
In the past 15 years, I have seen that practice all but fully disappear and be replaced by people who click at things until they find 1 thing, have a verbal meeting vaguely describing it, and repeat 2 to 3 times a day.
IMO, that isn't QA. It's being lazy, illiterate, and whiny while making the dev do ALL of the actual work.
When I departed QA myself, it was in the onset of automation.
In return, when the QA jobs disappeared, I learned basic scripting and started automating BI processes.
So, I would say:
1. I should hope modern QA departments (as I am told they exist) are automated and share both their tests and their results with devs in an efficient manner.
2. I don't think QA departments really exist today in a substantive way, and if they do, it isn't in as cooperative a fashion as described in 1.
I still have observed a world where QA went bye bye. Planning? Drafting a Scope of Work? Doing a proper analysis of the solution you are seeking, fleshing it out, and setting a comprehensive list of firm requirements that define delivery of said solution? Offering the resources to test the deliverable against the well documented and established requirements to give the all clear before the solution is delivered?
Doesn't exist anymore, and modern "QA" is being the lemming who sits in meetings and listens to the management, then schedules meetings to sit and complain at the Dev about how they aren't "hitting the mark" (because it was about 4 feet directly in front of them when they published, and is now at 5 erratically placed spaces behind them).
I think it's probably because we've shifted away from shipping software as a product and onto software as a service. I.e., in the 90s, if Win 95 irreversibly corrupted itself, that would be devastating to sales.
But today, with Windows 11? Just roll it out in one of the twenty-three testing branches you have and see what happens, and if shit does break, just work around it. It'll be fine. Even if something does happen, you can, most of the time, fix it and roll out a new update.
And I also think it's moved to be more team-centric, rather than department-centric. A lot of the theory is probably more of a senior-team-led type of responsibility, while everyone writing the code can chip in and add some as well. Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.
Also there's a lot more money in shipping shit out the door, than there is in shipping a functional product, unfortunately.
Thank you for your TED talk defining enshitification.
Middle management bloat.
Edit: Bonus points for
Developers knowing how to write secure code helps, so they should theoretically also be capable of QA themselves to a degree.
Which is straight up just saying "why don't the devs just do it themselves? I'm busy with meetings to whine back and forth with other middle management."
And that's precisely why QA still exists and why it shouldn't be the devs. And yet, you'll still wind up with weird situations, despite your best efforts!
Any good software developer is going to account for and even test all the weird situations they can think of ... and not the ones they cannot think of as they're not even aware of those as a possibility (if they were they would account for and test them).
Which is why you want somebody with a different mindset to independently come up with their own situations.
It's not a value judgment on the quality of the developer, it's just accounting for, at a software development process level, the fact that humans are not all knowing, not even devs ;)
and this is an incredibly valuable reason to have a technically simple UI, because it fundamentally limits the amount of stupid shit people can do, without it being the fault of the designer.
Maybe you need better signage. Maybe you need to reverse the direction of the door. Maybe you could automate the door. Or maybe the user is just fucking stupid. 😄
This is very perfectionist. Let me install my doors the way that's comfortable or pleasing. Where I see a knob I'll reach, and where I see a "pull" sign I'll pull, or get context clues.
There is research for everything. Let's say it's more comfortable to push and the knob is on the right side for me. I could spend way more time and effort than this deserves to appeal to that study. "I have great UX", I'd tell myself. But then I'd show this product on some eastern market where they read in "reverse", and it'll be neither comfortable nor "100% natural" for them. Meaning I'd fail; my UX would be horrible for half the planet.
This might be worthwhile for universal things that are already researched, where you don't need to spend years and a kidney to figure it out. Like maybe how "next", "cancel" and "back" buttons are placed next to each other. But I mean.. just copy the most recent one you used.
In Software Development it ultimately boils down to "are you making software for the end users or are you making it for yourself?"
Because in your example, that's what ultimately defines which "wrong" the developer is supposed to guide him/herself by.
(So yeah, making software for fun or for your own personal use is going to follow quite different requirement criteria than making software for use by other people.)
Wow. I totally forgot that Commodore BASIC ignores spaces in variable names. I do remember that it ignores anything after the first two letters though. That said, there's a bit more going on here than meets the eye.
PRINT HELLO WORLD is actually parsed as PRINTHELLOWORLD, that is: grab the values of the variables HELLOW (which is actually just HE) and LD, bitwise OR them together and then print.
Since it's very likely both HE and LD were undefined, they were quietly created then initialised to 0 before their bitwise-OR was calculated for the 0 that appeared.
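For the curious, that "crunch" behaviour can be sketched in a few lines. This is an illustrative Python model, not the actual 6502 tokenizer, and KEYWORDS here is a tiny assumed subset of the real keyword table; it just shows how spaces vanish, keywords get matched greedily inside what looks like a variable name, and only the first two letters of a variable name are significant:

```python
# Assumption: a tiny subset of the Commodore BASIC keyword list.
KEYWORDS = ["PRINT", "AND", "NOT", "OR"]

def tokenize(line):
    """Toy model of CBM BASIC's 'crunch' step: strip spaces, greedily
    match keywords anywhere, and keep only the first 2 chars of names."""
    src = line.replace(" ", "")  # spaces are ignored entirely
    tokens, buf, i = [], "", 0
    while i < len(src):
        for kw in KEYWORDS:
            if src.startswith(kw, i):
                if buf:
                    # only the first two letters of a variable matter
                    tokens.append(("VAR", buf[:2]))
                    buf = ""
                tokens.append(("KW", kw))
                i += len(kw)
                break
        else:
            buf += src[i]
            i += 1
    if buf:
        tokens.append(("VAR", buf[:2]))
    return tokens

print(tokenize("PRINT HELLO WORLD"))
# -> [('KW', 'PRINT'), ('VAR', 'HE'), ('KW', 'OR'), ('VAR', 'LD')]
```

So "HELLOWORLD" splits at the embedded OR keyword, leaving variables HELLOW (significant part: HE) and LD, exactly as described above.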
Back in the day, people generally didn't put many spaces in their Commodore BASIC programs because those spaces each took up a byte of valuable memory. That PET2001, if unexpanded, only has 8KB in it.
512KB? At the risk of going all Four Yorkshiremen, that sounds luxurious.
Floppy disks held 170KB if you were lucky enough to have a drive. The PET line, like many 8-bit computers, used a cassette tape drive (yes, those things that preceded CDs for holding and playing music). Capacity depended on the length of the tape. And it took ages to load.
The PET was fancy because it had a built-in cassette drive. That's what you can see to the left of the keyboard in the picture.
The main machines at work still do upgrades via tapes. The main program can communicate with lots of online services, but it still updates via tape. Probably too hard to spend the time to figure out how to implement OTA upgrades, since it was first created back in the 80s.
But the 512KB was more of a vague gesture towards the limitations back then. We had a separate floppy drive, with which I would load up a big black rectangle that had 1-5 very basic games on it. There's something special about locking down the disk which you can't get even with its smaller successor...
Comparing audio cassettes to modern high-density tape storage is pretty much the same comparison as an 8-bit computer with a modern 64-bit server, or, say, a hamster with a human.
Basically the same thing, but the differences are somewhat notable.
I speak 4 languages: English, Swedish, German and Polish. At work in Sweden our office language was English, because so many people from all over the world worked together. I was a consultant at the customer's office. There was another consultant from Poland visiting the customer, and after a heated meeting he sat down at his desk, which was adjacent to mine, and called his colleagues in Poland. He basically said that those Swedes are so stupid, they want us to use 9 women to give birth to the baby in one month instead of 9 months, without realizing that I could understand everything. I had to work hard not to burst out in laughter.
I actually know someone like this. He's been in software engineering since the early 2000s. I recently saw a post from him that he's now a firefighter recruit.
I’ve been in tech since 2005 and I wish I had the means to bail like that. I’ve honestly considered taking a fat pay cut and going back to driving a forklift.
It’s interesting because my dad followed a similar path and I wish I had the smarts he did. He worked as an electrical engineer and was with a company contracted by NASA. He told me how he got to work on some of the stuff in the space capsules back in the 70s/80s. Then at some point he became a full-time kitchen designer and was a carpenter. I asked him once why he left such a high-paying and interesting field. He said it was because all of the people he worked with were uptight squares and he just didn’t like it.
He passed away about 17 years ago. I wish he was still around. I could use his advice as a web dev that feels collectively burnt out and in a rut.
My friend was a pretty accomplished academic. Nothing like a mad genius or anything, but pretty excellent and capable. Wasn’t down with the rat-racey pressure to publish and oversell ideas. Left it all to go live in a small farm town. Last time we talked he seemed happy but it wasn’t the easy and smooth path of academia -> farm town, it was actually academia -> enormous existential crisis -> farm town.
An old employer of mine that I fired was full to the brim of people who genuinely thought that nine women could make a baby in one month.
Like the techies we are, we pointed out that nine women could average one baby per month, if that's what they wanted, over nine months - but it requires another nine months of planning first.
They didn't get it. Just kept hiring fixed term contractors to "increase velocity".
The worst part of it was that when my team was just the small internet hippy department that no one took seriously, we never had these problems; then we got promoted to "proper department" and lost everything.
Exactly! You actually CAN have 50 people finish something 50x faster, but it takes a shitload of planning, and that equals time and money no company I have ever worked for, or even known of, would allocate to something that isn't generating immediate income.
Take the Hoover Dam for example: designed over 3-ish years and built in 5, at a time when nothing that huge had ever been made before, for less than a billion in today's money, and 2 years ahead of schedule. It's 90 years old.
Programmer Humor