I was gonna say… you’d first have to tell me what Clippy was really meant to help with back in the day, besides saying “here’s a link to the help page.”
I mean, the idea wasn’t terrible, it just wasn’t executed well.
It was supposed to provide a non-threatening way to help users discover functionality in their device or software that they might have been unaware of but that was relevant to their current task. That would help users accomplish their task more efficiently, and help Microsoft by increasing consumers’ perception of the value the software provides, which makes them less likely to want to switch to something else in the future.
A modern, potentially useful Clippy would ideally be able to tell…

- what you were actually doing
- if you appeared to be struggling or doing something repetitively
- if it has the ability to help

…before it tries to interact with you. (Rough sketch of what I mean below.)
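Just to make that concrete, here’s a toy sketch of the kind of “don’t interrupt unless the user is clearly looping” heuristic I’m imagining. Everything in it (the event names, the thresholds, the `can_help_with` callback) is invented for illustration, not how any real assistant works:

```python
from collections import deque
import time

class StruggleDetector:
    """Toy heuristic: only offer help if the user repeats the same
    action many times in a short window (e.g. undo/redo thrashing).
    Thresholds and action names are made up for illustration."""

    def __init__(self, window_secs=30, repeat_threshold=5):
        self.window_secs = window_secs
        self.repeat_threshold = repeat_threshold
        self.events = deque()  # (timestamp, action_name)

    def record(self, action_name):
        now = time.time()
        self.events.append((now, action_name))
        # Drop events that have fallen out of the time window.
        while self.events and now - self.events[0][0] > self.window_secs:
            self.events.popleft()

    def should_offer_help(self, can_help_with):
        if not self.events:
            return False
        # Count repeats of the most recent action inside the window.
        last_action = self.events[-1][1]
        repeats = sum(1 for _, action in self.events if action == last_action)
        # Stay quiet unless the user is clearly looping AND we actually
        # have something relevant to offer (can_help_with is hypothetical).
        return repeats >= self.repeat_threshold and can_help_with(last_action)
```

The point being: it watches first, and the default answer is “shut up.”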
Beyond that, it should be able to link you to the tool in question in a way that automatically sets it up to do what you’re trying to do, so using it doesn’t set you back from where you were. Or it could just offer to do the thing for you, in a way that doesn’t trash your work if you hate the output.
It’s still probably gonna suck ass and not be helpful, but at least it wouldn’t be vaguely mystifying why it even existed.
The best “digital assistants” I’ve seen recently are ones that actually acknowledge that these are language tools, not “knowledge” or “reasoning” tools.
They can legitimately do a good job figuring out a good response to what you ask them, the accuracy question aside. So if you set one up to know what data is available and how to format it, you can get it to respond to questions like “are there trends in the monthly sales statistics for the past three years?” with a graph of those statistics broken down by product, rather than letting a language tool try to do reasoning on numerical data.
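Something like this pattern, roughly. The model call, prompt, and column names here are all hypothetical (`call_llm` is a stand-in for whatever model you’re using); the shape is the point: the model only ever picks *what to plot*, and deterministic code does the actual math:

```python
import json
import pandas as pd
import matplotlib.pyplot as plt

def ask_model_for_chart_spec(question, columns):
    # The model's only job: map a natural-language question onto
    # a chart spec over known columns. It never sees the numbers.
    prompt = (
        f"You pick chart specs. Available columns: {columns}. "
        'Respond with JSON: {"x": <column>, "y": <column>, '
        '"group_by": <column or null>}.\n'
        f"Question: {question}"
    )
    raw = call_llm(prompt)  # hypothetical stand-in for your model call
    return json.loads(raw)

def answer_with_graph(question, sales_df: pd.DataFrame):
    spec = ask_model_for_chart_spec(question, list(sales_df.columns))
    # Deterministic code does the grouping and plotting, so the numbers
    # in the output are real even when the model is being dumb.
    if spec["group_by"]:
        for name, grp in sales_df.groupby(spec["group_by"]):
            plt.plot(grp[spec["x"]], grp[spec["y"]], label=str(name))
        plt.legend()
    else:
        plt.plot(sales_df[spec["x"]], sales_df[spec["y"]])
    plt.xlabel(spec["x"])
    plt.ylabel(spec["y"])
    plt.show()
```

Worst case, the model picks a dumb chart. It can’t hallucinate a sales figure, because it never touches one.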
Talking good can sound like reasoning, because right now the things that talk good are usually humans with basic reasoning skills. That’s why it’s so confusing when they happily spout irrational nonsense: we’re used to rationality being a given in things that are confident and articulate.