A beginner-friendly guide to WCAG, written in what seems to be the equivalent of FALC (Facile À Lire et à Comprendre, the French plain-language standard).
The Sustainable Web Interest Group today published a first public Draft Note of the Web Sustainability Guidelines (WSG).
The digital industry is responsible for 2-5% of global emissions, more than the aviation industry. The Web Sustainability Guidelines (WSG) cover a wide range of recommendations to make web products and services more sustainable.
These guidelines apply planet, people, and prosperity principles (the PPP approach) throughout the decision-making process, allowing users to minimize their environmental impact in various ways: user-centered design, performant web development, carbon-free infrastructure, sustainable business strategy, and, supported by measurability data, combinations of these.
…
Software teams often reach for Kubernetes or similar prepackaged answers as default solutions to complex problems. But Kubernetes isn’t a strategy—it’s a tool. Using it prematurely can bury your team in unnecessary complexity and unwanted consequences. These ‘default’ answers reflect a deeper issue: we don’t understand the problem we're solving.
Through real-world examples, we’ll discuss how to think critically about the way decisions are being made in your company. We’ll introduce concepts like participation theater—when people perform the rituals of decision-making without making real decisions—alongside problem restatement as a tool to uncover the real challenge at hand. We’ll also examine different types of decisions (reactive vs. proactive, reversible vs. irreversible) and why recognizing them early changes how you should approach them.
This talk is a call to slow down to speed up your decision-making. Whether you're an engineer, architect, or tech lead, this session will challenge you to pause before reaching for Kubernetes (or other technologies) and instead ask: what problem am I really trying to solve?
About Gien
Gien Verschatse is an experienced consultant and software engineer who specialises in domain modelling and software architecture. She has experience in many domains, such as the biotech industry, where she specialised in DNA building. She's fluent in both object-oriented and functional programming, mostly in .NET. As a Domain-Driven Design practitioner, she always looks to bridge the gaps between experts, users, and engineers. Gien is studying Computer Science at the OU in the Netherlands. As a side interest, she's researching the science of decision-making strategies, to help teams improve how they make technical and organisational decisions. She shares her knowledge by speaking at international conferences. And when she is not doing all that, you'll find her on the sofa, reading a book and sipping coffee.
Attempting to parse HTML with regular expressions is an infamous pitfall, and a great example of using the wrong tool for the job. It's generally accepted to be a bad idea, for a multitude of reasons.
There's a famous Stack Overflow answer about why you should never, ever do it. In fact, that answer got so popular that it was used like a copypasta in some circles. Every time I stumbled upon it, I would think there's a lot of truth in it, but at the same time, I couldn't fully agree...
But... can't you, really?
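The pitfall is easy to demonstrate. Here's a minimal Python sketch (my own illustration, not taken from the post) showing how a regex trips over nested tags while a real parser handles them:

```python
import re
from html.parser import HTMLParser

html = "<div>outer <div>inner</div> tail</div>"

# Naive regex: the non-greedy match stops at the FIRST closing tag,
# so it cannot pair nested <div> elements correctly.
naive = re.search(r"<div>(.*?)</div>", html).group(1)
print(naive)  # "outer <div>inner" -- the nesting is broken

# A real parser tracks tag depth and recovers the text correctly.
class DivCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "div":
            self.depth += 1

    def handle_endtag(self, tag):
        if tag == "div":
            self.depth -= 1

    def handle_data(self, data):
        if self.depth > 0:
            self.texts.append(data)

p = DivCollector()
p.feed(html)
print("".join(p.texts))  # "outer inner tail"
```

The regex has no notion of nesting, which is exactly why HTML (a recursive grammar) is beyond what regular expressions alone can parse.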
It’s commonly understood that automated accessibility testing tools can find around 20-30% of accessibility issues, but what does this actually mean in practice? Beau takes a closer look at these tools using specific examples, demonstrating the types of accessibility issues automated testing tools won't find, why that is the case, and how they could be giving you a false sense of accessibility.
About the speaker
Beau is the Technical Lead for TTC Global's Digital Accessibility Practice, and formerly the Lead Auditing and Testing consultant for the digital accessibility team at Vision Australia. Beau has been working in digital for 20 years, as an accessibility consultant since 2019, and formerly as a Web Developer and Senior/Lead Front End Developer for digital agencies in Melbourne.
Disclaimer: This post is not about Lighthouse specifically; other testing tools perform similarly. It's about us developers and our responsibility not to thoughtlessly rely on automated testing.
It’s always nice to see when people post their Lighthouse scores on social media to highlight how well they’ve optimised their own or their client's website. It shows that they care about the quality of what they build.
Trisha Gee
Observability as the key to performance tuning Software Delivery
The golden rule of application performance tuning: measure, don’t guess. Yet when it comes to developer productivity, too many teams still guess. Builds are slow, tests are flaky, CI feels overloaded—and the default response is to throw hardware at the problem or hope it goes away.
In this talk, we’ll apply the performance engineering mindset to developer experience, showing how observability data from Develocity can profile builds and tests just like applications. By measuring and optimizing build and test performance, teams directly improve the DORA metrics that matter: shorter lead time for changes, lower change failure rates, faster recovery, and higher deployment frequency.
Developer productivity is a performance problem. If you want faster delivery and happier developers, the path is the same as for applications in production: measure first, then optimize.
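The "measure first" idea can be sketched in a few lines (a toy illustration of the mindset, not Develocity's API): time each build step, then rank by cost so optimization effort goes where it pays off.

```python
import time

def run_step(name, fn, timings):
    """Run one build step and record how long it took."""
    start = time.perf_counter()
    fn()
    timings[name] = time.perf_counter() - start

# Stand-in steps; real builds would invoke compilers, test runners, linters.
timings = {}
run_step("compile", lambda: time.sleep(0.05), timings)
run_step("unit-tests", lambda: time.sleep(0.12), timings)
run_step("lint", lambda: time.sleep(0.01), timings)

# Rank steps by cost: optimize the head of this list first, not a hunch.
ranked = sorted(timings, key=timings.get, reverse=True)
print(ranked)
```

The same profiling discipline applied to applications in production is simply pointed at the build pipeline instead.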
Achieve efficient, maintainable, and simple Java exception handling by killing those anti-patterns.
Todo: there are some good quotes worth pulling out of this.
What happens if we can't make another CPU...ever?
What fails first? How long would datacenters last? Does the Internet start to fracture?
Of course, it's a hypothetical thought experiment. But it's interesting to think about which chips would stand the test of time, and which might fail sooner than you think!
Here are the other channels I mentioned, take a look:
@jeriellsworth made microchips at home, and is an excellent engineer + teacher, go check her out!
@KazeN64 is doing wild optimizations with N64 hardware:
https://m.youtube.com/watch?v=S1lKWW7K5r4&pp=ugUHEgVlbi1HQg%3D%3D
You're too intelligent to take action (Why overthinkers can't execute)
You’re not lazy.
You’re not unmotivated.
You’re just stuck in strategy mode, and your intelligence might be making it worse.
In this video, I’ll show you:
– Why overthinkers struggle to act
– The hidden skillset most people never train
– What schools, courses, and work environments got wrong
– And how professional athletes train the exact skill you’re missing
If you’re tired of planning and want to finally execute, this is for you.
0:00 - Why smart people struggle to act
0:41 - The trap of intelligence and strategy
1:36 - What happens during execution
2:57 - My personal story of getting stuck
3:50 - Meet the Strategist and Performer
5:00 - The skill gap that blocks progress
6:07 - The athlete mindset we need to steal
7:15 - Why your plans always collapse
9:10 - The emotional cost of taking action
10:20 - Fake actions and avoidance traps
11:45 - Why you're stuck in the Strategist Loop
13:00 - What the Performer really experiences
14:00 - The hidden performance state
15:00 - Create plans for real-world execution
16:00 - The exposure vs overwhelm curve
17:30 - Train your Performer like a firefighter
18:20 - How misalignment ruins your progress
19:40 - Building a healthy internal partnership
21:00 - What your Performer really needs
22:00 - The power of simple rules
23:30 - Coaching example: Helen's first post
25:30 - Final thoughts + free assessment link
The AI Bubble Is About to Pop
Sam Altman, the CEO of OpenAI, recently admitted that the AI boom might actually be a bubble — just like the Dot-com crash when trillions vanished almost overnight. Today, the world is pouring trillions into chips, data centers, and hype-fueled AI startups… but what happens when the returns don’t show up?
In this video, we break down exactly how the AI bubble formed — and what could happen when it bursts:
⏱ Chapters
0:00 Sam Altman’s Bubble Warning
0:52 Stocks Run on Vibes
1:47 Trillions Burned on Chips
2:49 The Energy Wall
3:58 The Tech Is Brittle
5:20 The Psychology Trap
6:37 The Venture Bubble Mechanics
7:23 What Survives After the Pop
8:00 Free Budget Tracker (Template)
From trillion-dollar spending and fragile tech to rising energy costs and companies chasing hype over results, this is the real story of the AI bubble — and what might be left standing after it pops.
👉 Do you think AI is the future, or is it already a bubble waiting to burst? Drop your thoughts in the comments.
If you enjoy deep dives into money, business, tech, and the economy, make sure to subscribe for more videos like this.
John Oliver discusses ABC’s move to pull Jimmy Kimmel off the air, what it has to do with Brendan Carr and the FCC, what it means for free speech in the United States, and which broadcasting giant should open an Italian restaurant. Ok fine: it’s Tegna. With a name like “Tegna” you’ve just gotta serve complimentary garlic knots. End of discussion.
Inclusive Components
A blog trying to be a pattern library.
All about designing inclusive web interfaces, piece by piece.
- Cards
- Data tables
- Notifications
- Content sliders
Learn to write better, resilient CSS
If you find yourself wrestling with CSS layout, it’s likely you’re making decisions for browsers they should be making themselves.
Through a series of simple, composable layouts, Every Layout will teach you how to better harness the built-in algorithms that power browsers and CSS.
In this video I analyze the DOOM project by id Software.
Comment, like, subscribe, let’s trigger the algo!
The focus will be on software architecture, technical limitations, and technical concepts.
- 00:00 DOOM
- 03:18 Software Architecture
- 08:06 Build Process
- 10:04 Component Diagram
- 11:07 WAD Files
- 13:20 Main Loop
- 14:35 2D Renderer
- 20:45 3D Renderer Intro
- 24:22 Binary Space Partitioning
- 27:45 BSP Example
- 31:03 Player FOV
- 35:22 Wall Clipping
- 40:30 Visplanes
- 41:25 "Masked"
- 42:21 Conclusion and Lessons
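The BSP chapters above cover how DOOM decides draw order; here's a minimal toy sketch of front-to-back BSP traversal (my own 1-D simplification for illustration, not id Software's code):

```python
# Each internal node splits space at a position; DOOM splits 2-D space
# with arbitrary lines, but a 1-D split shows the same traversal idea.
class Node:
    def __init__(self, x, left=None, right=None, wall=None):
        self.x = x          # split position (unused for leaves)
        self.left = left    # subtree with x < split
        self.right = right  # subtree with x >= split
        self.wall = wall    # leaf payload: a wall to draw

def traverse(node, viewer_x, out):
    """Emit walls near-side-first relative to the viewer."""
    if node is None:
        return
    if node.wall is not None:        # leaf: emit the wall
        out.append(node.wall)
        return
    # Visit the side containing the viewer first, so geometry on the
    # viewer's side of each split is emitted before the far side.
    if viewer_x < node.x:
        near, far = node.left, node.right
    else:
        near, far = node.right, node.left
    traverse(near, viewer_x, out)
    traverse(far, viewer_x, out)

tree = Node(10,
            left=Node(0, wall="A"),
            right=Node(20,
                       left=Node(0, wall="B"),
                       right=Node(0, wall="C")))
out = []
traverse(tree, viewer_x=15, out=out)
print(out)  # ["B", "C", "A"]
```

Because the tree is built once at map-compile time, this ordering comes essentially for free at runtime, which is what made the technique so effective on 1993 hardware.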
Cleanup, Speedup, Levelup.
One package at a time.
e18e (Ecosystem Performance) is an initiative to connect the folks and projects working to improve the performance of JS packages.
We'd also like to provide visibility to the efforts of countless open source developers working to clean up, level up, and speed up our dependencies.
We invite you to get involved in the different projects linked from these pages, and to connect with other like-minded folks.
As part of the community e18e effort, this project provides a collection of module replacements (i.e. possible alternative packages).
We provide two things:
Manifests (mappings of modules to their possible replacements)
Documentation for more complex replacements
A list of JavaScript methods you can use natively, plus an ESLint plugin.