simongray's comments | Hacker News

NemID, the previous national 2-factor solution, used a small card with rows of pre-printed single-use codes. When you logged in to a bank or a public sector website, it would ask for a random code at a specific row and column number. Once the system registered that you had just a handful of codes left, a new card would be sent to you via snail mail. It worked fine for the time.

The current system, MitID, depends on smartphones, though you can get an external key generator as a backup too.


The big drawback of one-time passwords is that they don't protect against man-in-the-middle attacks such as phishing, which are in practice among the most common attacks on systems of this scale.

The logistics operation involved in distributing codes is also very expensive and inflexible. One day you may need to authenticate payments a dozen times in an hour, say at a farmers' market that doesn't take card payments or while dining out with friends, and another day not at all.

Given all this, a good old public key infrastructure makes sense. But that is unfortunately also usually the first step to a complexity explosion.


> The big drawback of one-time passwords is that they don't protect against man-in-the-middle attacks such as phishing, which are in practice among the most common attacks on systems of this scale.

This is true and was definitely a criticism of the old system, where websites would open the NemID iframe and ask you for your username, password and a specific indexed OTP code, without providing any authentication to you. You only noticed something weird if it asked you for the index of a code that was not on your card, but maybe the scammer got lucky and guessed an index that you did have, and then they could use that phished username/password/OTP triple to perform an unauthorized action.

The new system is slightly different: if you use the mobile phone authentication, it will send a notification to your phone, but if you use the (bespoke, non-standard) OTP dongle, it still does not authenticate itself to the user. However, the codes are now time-based, so if attackers collect an OTP code they can only use it within a ~30s window, meaning the phished credentials have to be used immediately.
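For context, time-based OTPs of this kind generally follow RFC 6238 (TOTP): the code is an HMAC over a counter derived from the current time, so a phished code dies as soon as the time step rolls over. A minimal sketch in Python, using the RFC's published SHA-1 test key; MitID's actual dongle parameters are not public and may well differ:

```python
import hmac
import hashlib
import struct

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    counter = int(at) // step                       # e.g. t=59s -> counter 1
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"  # RFC 6238 Appendix B test-vector key
print(totp(secret, 59))           # "287082" per the RFC test vectors
print(totp(secret, 60))           # counter has rolled over, so a new code
```

The key property for phishing: a code stolen at t=59 is only valid while the counter is unchanged, i.e. until t=60, so an attacker must relay it within seconds.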


> One day you may need to authenticate payments a dozen times in an hour, say at a farmers' market that doesn't take card payments or while dining out with friends, and another day not at all.

It's very unlikely people would need to mess about with MitID/BankID if they can't use card payments at a market. Firstly, if they're doing the almost-unheard-of clunky approach of using their mobile banking app to make a bank transfer, it would probably be authorised using their touch/face ID instead of BankID/MitID. But far more likely, they'd use one of the ubiquitous mobile payment apps: Vipps (Norway), Swish (Sweden) or MobilePay (Denmark).


> The logistics operation involved in distributing codes is also very expensive and inflexible. One day you may need to authenticate payments a dozen times in an hour, say at a farmers' market that doesn't take card payments or while dining out with friends, and another day not at all.

Neither of the scenarios you describe would require you to authenticate using MitID: peer-to-peer payments in Denmark are typically done using the app MobilePay, which only requires MitID authentication during setup. And you never need MitID authentication when paying in person; at most you'll need your card's PIN code.


Yeah but functionally it is the same. If the website is down it doesn't matter if I got the OTP code from a piece of paper or the dongle.


> And what's the actual result? Just look at the market capitalization of European companies compared to US companies..

Europe is actually doing quite well at the moment. The European stock markets have outperformed the US quite decently ever since Trump became president, despite the various curveballs thrown at Europe in recent years. Market capitalisation in the US is held up primarily by the Magnificent 7, which are huge outliers in the American stock market.


There is a recency bias here. The S&P 500 has outperformed the STOXX 600 every year for the last five years, except 2022.

Cumulative returns are around 100% for the American index vs. 60% for the European one.


Maybe the momentum of the STOXX 600 will last the next 4 years? Or maybe the S&P 500 will come crashing down soon? Who knows.

The Shiller PE ratio is insanely high. At least the European market isn't completely overinvested in just 7 companies that are spending a lot of their money on the exact same thing, so it has that going for it.


EU Inc. will become a reality soon, so it isn't like the Commission is standing still: eu.inc/what-is-eu-inc


EU Inc. is worthless without alignment on a single capital market for fundraising and ultimately going public, sane/interoperable labor laws for hiring, and a single language market over the long term.

The last piece is extremely important. Being able to raise money and hire across the EU with no friction would be fantastic, but it means nothing if actually selling into different EU markets has massive language barriers (average people in many neighboring EU countries cannot communicate with each other beyond the level of a 4-year-old). English fluency is massively overstated by people who have only visited European tourist capitals.


It's a crucial step on the way. Definitely nothing to scoff at.


Political tidal forces in Europe have, for quite some time, pointed more toward fragmentation than toward strengthening common structures. What makes this particularly ironic is that this impulse is often strongest among the same voices that most loudly lament Europe’s failure to build globally competitive industries—software foremost among them.

That tension has always struck me as deeply paradoxical. In the post-Brexit era, we have had a very visible case study in what happens when shared European frameworks are removed. The UK has spent years scrambling to recreate institutions, regulatory mechanisms, and coordination structures that had previously been provided at the EU level. One might expect that experience to have clarified the value of those structures. It largely hasn’t.

A significant part of the problem is a deep lack of understanding. "EU bureaucracy" is a common target of criticism, yet it is remarkably rare for critics to have any concrete sense of what that bureaucracy actually does. The EU tends to appear in public discourse only when politicians argue, or when a regulation is framed as an intrusion on national sovereignty.

The everyday, unglamorous work of harmonization, reducing friction, enabling cross-border activity, and making markets function at scale remains almost entirely invisible.

This creates a structural communication failure. The benefits of integration are mostly preventative and cumulative: things that don’t break, costs that don’t arise, barriers that quietly disappear. These effects are hard to convey through headlines or sound bites. Dry institutional reports are a poor match for a public sphere with limited patience for complexity. The result is a persistent undervaluation of the very mechanisms that make large, integrated markets possible.

Language barriers are often invoked in these discussions, and while they are real, their relevance is frequently overstated in this context. In white-collar professions, English proficiency is generally passable to good. This is especially true in software engineering, where English is effectively the working language of the field.

That said, proficiency is often domain-specific: people may read and write technical English fluently while still struggling with more active uses such as negotiation, persuasion, or conflict resolution.

In typical blue-collar professions, by contrast, language barriers are substantial and unavoidable.

Where the problem becomes genuinely self-defeating is in the insistence that using English as a shared working language represents some form of cultural submission or imperialism. This view, rooted more in nationalist romanticism than in economic reality, adds pointless friction. It is beyond stupid to waste resources publishing official documents in 24 different languages. But eliminating this waste is a hard sell when you ask the muggles.

It brings us back to the central contradiction: the same people who regret Europe’s inability to produce globally dominant software companies often support attitudes and policies that fragment markets, raise transaction costs, and make such outcomes far less likely.

Europe cannot simultaneously expect to realize the benefits of scale and reject the mechanisms that make scale possible.


Trump and Putin are giving a golden opportunity to revive European integration. Alas, nationalistic populism with a badly hidden sympathy for the US (on the right) and Russia (on the left) seems to catch more votes these days.


Definitely agree, this is the classic left/right contradiction that has always existed.

In the past, center-left and center-right coalitions were able to find win-win compromises out of this contradiction. But now that everyone has moved outward on the political spectrum and gone populist on both sides, it's a stalemate.

The pro-central planning folks are now anti-business and anti-growth since private capital represents a threat to their utopian authoritarian dreams (this truth will be masked with religious appeals to the poor and the environment of course).

The pro-business, pro-growth folks are conversely anti-central planning, since government represents a threat to their utopian libertarian dreams (central planners might kill the unfair arbitrage opportunities they've found, and central planners tend to overspend and expect the private sector to pay for it).

While central planners are terrible capital allocators, strong central planning is the only way to create well functioning markets. For example, the US Federal government wields total control over US state governments in basically everything.

What Europe needs is a center coalition of pro-business and pro-government wonks (basically what the neocons were), but the phrase 'neocon' has become a bizarre internet meme for conspiracy theorists and there exists very little interest in moderate viewpoints these days.

I'm guessing we'll all be dead before any of these issues are solved in Europe (if ever), absent a full-scale Russian or Chinese invasion forcing the EU to integrate.


> What's the saying? Time in the market vs timing the market.

Seems like he managed both.


> everyone but the biggest players throwing out a lot of bathwater with very little baby by simply not accepting Danish users (if required).

The biggest players in social media are precisely the ones that this law is targeting.

No one in charge of implementing this law is going to care whether some Mastodon server implements a special auth solution for Danish users or not, they are going to care that Facebook, TikTok, Instagram, etc. do so.


> No one in charge of implementing this law is going to care whether some Mastodon server implements a special auth solution for Danish users or not, they are going to care that Facebook, TikTok, Instagram, etc. do so.

And if that little Mastodon server ends up hosting some content that is embarrassing or offensive to the Danish authorities, laws like this will surely not be used to retaliate...

Arbitrarily and selectively enforced laws seem like an obviously bad thing to me. If the government can nail me for anything, even if they practically don't, I'll be very wary of offending or embarrassing the government.


Why do you think it's going to be arbitrary?

The law will obviously be framed in such a way as to hit the targets it is supposed to hit and avoid collateral damage. It's not like complete amateurs are writing our laws.


That it's going to be arbitrary is your own assessment. You said that "No one in charge of implementing this law is going to care whether some Mastodon server implements a special auth solution for Danish users or not, they are going to care that Facebook, TikTok, Instagram, etc. do so."

I responded by explaining why that wouldn't be a good thing.

Have you changed your mind on that point, or are you simply not keeping track of your argument? Either way, there can't be an honest discussion if you have the memory of a goldfish or are deliberately ignoring what you've said.


I am talking about the purpose of the law and the way it is written. It's not hard to create a law that only targets the bigger services, just make it apply to entities with a daily user count above N. A law isn't a headline on Hacker News, it's a carefully written document.


> I am talking about the purpose of the law and the way it is written. It's not hard to create a law that only targets the bigger services, just make it apply to entities with a daily user count above N.

Why would they, though, if "no one in charge of implementing this law is going to care whether some Mastodon server implements a special auth solution for Danish users or not"? The EU CSAR proposal (which Denmark seems very much on board with) doesn't make such exceptions, so why should this law?

> A law isn't a headline on Hacker News, it's a carefully written document.

This is a non sequitur, and also pure speculation.


> in exchange for a promise to use the revenues "for good purposes".

They still do an enormous amount of charity, though the foundation's activities are probably highly localised to Denmark: https://en.wikipedia.org/wiki/Novo_Nordisk_Foundation


The behaviour of an HTML datalist is basically completely different in every browser. It is a highly flawed element.


> How does programming with Clojure targeting multiple platforms (JVM, JS, CLR, LLVM, ...) work?

Each variant has its own file extension, e.g. .clj for JVM and .cljs for JS.

If you're writing code that needs to work on multiple platforms, you put it in a .cljc file. Any code in these files that still needs to differ due to the platform choice is differentiated inline using a reader conditional, which means each platform's compiler gets a (slightly) different abstract syntax tree. This is not too dissimilar from writing cross-platform code in other languages, just more convenient thanks to the Lisp style.
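A minimal sketch of what that looks like in a .cljc file (the `now-ms` function here is a made-up example, not from the comment above):

```clojure
;; The #?(...) reader conditional is resolved at read time, so each
;; platform's compiler only ever sees its own branch of the form.
(defn now-ms
  "Current time in milliseconds, on either the JVM or in JS."
  []
  #?(:clj  (System/currentTimeMillis)
     :cljs (.getTime (js/Date.))))
```

The JVM compiler reads only the `:clj` branch and the ClojureScript compiler only the `:cljs` branch, which is what produces the slightly different syntax trees mentioned above.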


Having to rate the 30 examples made me realise just how much HN is dominated by LLM content these days. Kinda sad.


I genuinely find that interesting to hear. What about the 30 examples felt different to, say, the frontpage? (Assuming it did.)

On a meta level I was suuuuper conscious of writing every word of this post/comments myself, as my prior is that HN's community is very intolerant of and highly sensitive to low-effort content, whether via AI or not. This is despite using AI tools for lots of other parts of work (drafting, coding, summarising, brainstorming, etc.).

Do you think HN has become more accepting of AI slop, that the slop is becoming harder to detect, or that HN isn't as discerning as I assume?


I'm not talking about the content but about the topics.


I mean... 20% is not really a lot. It's probably a lot closer to 100% in most countries of the world.


Is it? That doesn’t sound right.


Depends. English first language countries remain mostly monolingual. But the rest divides into:

- educated people are expected to learn English in school and end up consuming English media anyway (where you'd expect >50% multilingual, but not everyone)

- country has many official languages (many people are multilingual, but not necessarily in English; e.g. India, Indonesia, possibly China)

- country has literacy problems (not so many left now, maybe in sub-Saharan Africa)

- proud monoglots of a language that isn't English: Japan, France (but even here a lot of people consume English media anyway)


> - country has many official languages

Belgium has 3 languages, but my guess would be that each region speaks English better than the other national languages. The French speakers, pardon, the Walloons, scarcely speak Dutch, and while the Flemish speak better French it's usually not great (unsure whether most people would qualify as fluent). AFAIK Flanders has mandatory French in school, but Wallonia doesn't need to take Dutch, even though 60% of the population is Dutch-speaking. The German-speaking region is mostly forgotten about; its people either integrate with the French-speaking part or work in Germany with Belgium as a cheap place to live.

The Netherlands has Papiamento as the native language of most people in one part of the country. They're overseas, but they vote for the same government and live under the same law. I literally didn't know this until a few years ago (I'm 30). I assume they don't want independence due to things like getting defence and other benefits from a much larger economy (and we're right to feel the need to pay such repairs), but man, this feels really 1800s-slave-trade levels of wrong. Not a soul speaks Papiamento in the European Netherlands; it's not even an option in school — let alone compulsory!

In Luxembourg it's hit or miss whether someone speaks the national language (Luxembourgish), French (an administrative language), or German (another administrative language). Many will speak at least two, but many also speak only one (French in particular).

Very eurocentric perhaps but that's my experience with countries that have more than one official language: nearly nobody bothers learning the other if there is no direct necessity


90% of Norwegians speak English according to a quick search I just did; 89% in Sweden.


That’s not “most countries”.


Maybe you can argue why it's not most countries? It seems obvious to me that it is, but I also come from a country where everyone is bilingual.

Many former European colonies are mostly bilingual, e.g. Africa is highly multilingual out of necessity. Much of Europe itself is also mostly bilingual. If you want to communicate outside your own little region and your native language isn't a lingua franca, you need to be bilingual in this world.

The main holdouts when it comes to bilingualism are former imperial powers who managed to both kill domestic language diversity (e.g. France, UK, Russia) while also spreading their national language as a lingua franca. Another group of holdouts are settler colonies such as the US, which didn't have a dominant native population after the arrival of Europeans.

But even if e.g. Russia itself isn't super bilingual, the rest of the former Soviet Union certainly is, since that is just the reality if you live in a small and/or formerly colonised country.


Contribute more data points of your own, then, or even just one. Maybe eventually we'll get to know whether it's most or not, rather than dismissing someone who's helping.


I believe the more damning thing is that there are more multilingual English speakers than monoglots, merely by virtue of ESL speakers being more common than native English speakers.

