
I'm getting really tired of writers crapping on 'AI' as if it were a static, self-sufficient offering.

Like, no, the AI doesn't somehow know what the terrorist is thinking. It summarizes what it's being fed.

If a chatbot was being fed reports concerned about border activities, then it's going to raise concerns about border activities.

This is an unnecessary and misleading angle; the article is jumping on a bandwagon.

The failure here is a broader one: Western intelligence services have deprioritized human intelligence in favor of contracts with third-party defense contractors. There's a story in that.

"AI not knowing the terrorist's mind" is not much of a story.




The issue is that many non-tech (and, I'm starting to think, also some tech) people believe that "AI" is an accurate label, and therefore expect these algorithms to be able to think intelligently. The reason it's called "AI" instead of, say, "large language models" (or whatever algorithm is being used) is precisely to create this impression of capability, so as to sell the product.

"Using artificial intelligence, the system analyzes behavior, predicts risks, raises alerts..."

No, not very well, it doesn't. And that claim is not at all equivalent to "it summarizes what it's being fed".


> The issue is that many non-tech (and, I'm starting to think, also some tech) people believe that "AI" is an accurate label, and therefore expect these algorithms to be able to think intelligently.

Until we can define it, I think we should stop using the term 'intelligent' at all. It misleads people precisely because it means different things in different contexts.

If something can comprehend language, solve word problems, score very highly on the SAT and LSAT, and translate perfectly from any language to any other, we could definitely say it is 'intelligent' in all of those contexts. Is it 'intelligent' in other contexts?

Applying a technology that is really good at many things to things it is not good at, and selling that as a panacea, is not a new idea. If we want that to change in this instance, we should start by defining the terms we use, so that we can determine the scope of the technology's relevance to any given area. Otherwise people make assumptions to their detriment, and we can't even agree on what we are arguing about.



