Maybe I’m an idiot, but when the article claims “Cleaner fractions means fewer approximations and more accurate maths, and the researchers suggest we can learn from it today,” how would a base-60 system actually make any difference, when computers are powerful enough to generate answers with more accuracy than is ever needed in real-world applications?
None. In a modern context we can work in any base we desire; all that basic stuff got generalized ages ago. No one is going to change computing systems to a Babylonian-style base. And the trigonometry stuff is the same thing we already knew; it was just discovered before the Greeks got to it.
It’s an important discovery for sure, especially for our understanding of ancient Mesopotamian cultures, but everything else is the authors and the article going bananas with their conclusions.
That’s kind of what I figured. I wish journalism didn’t need to be so incredibly sensationalist. I understand that it’s because the majority of the populace has the attention span of a gnat but it doesn’t make me feel any less annoyed by it.
deleted by creator
Rule 4 of the community forbids changing the headline. So you’d really need to convince the mods.
Computers still run different algorithms internally, some of which are more prone to having undetected errors than others:
https://en.m.wikipedia.org/wiki/Pentium_FDIV_bug
Computers use base 2, binary. Whether humans use base 10 or base 60 is irrelevant.
The algorithms coded into computers are not in base 2, though. Only operating functions of the computer itself are in base 2.
You don’t code in binary
Everything you code is binary. You may write ‘15’, but the code your computer runs will use ‘00001111’. The base-10 source code is only like that for human readability. All mathematical operations are done in binary, and all numbers are stored in binary. The only time they are not is the exact moment they are converted to text to display to the user.
What do you mean algorithms are not in base 2? What else are they in?
Just because you have human-readable code which uses base 10 doesn’t mean it isn’t all translated to binary down the line. If that’s what you’re referring to, of course. Under the hood it’s all binary, and always has been.
Because calculations happen in the form the calculation is written. The math is done in whatever base the algorithm is told to process in.
Okay, walk me through how you think the code an algorithm is written in gets processed by the computer step by step, please. How do you think a computer operates and is programmed?
Let’s say you have code to tell the computer to calculate 3 + 5, in, say, C, because that’s close to assembly. What happens on the technical level?
I’m sorry, but you seem to be missing the fundamental distinction between the things we are talking about.
Then explain to me the fundamentals you are referring to.
Because if you are wondering how a computer processes information, I can tell you. I can back it up with sources, and explain how code gets compiled down to assembly and how assembly is executed (spoiler alert: it’s binary, as it’s a direct representation of the digital circuits at the hardware level, which operate on binary states, on and off). You just have to ask.
So, I’m a writer, not a researcher, but I’ve found the more tools I have stuffed into my brain, the more likely it is that two different things clank against each other and create something interesting.
I don’t think this is something unique to writing fiction–from my understanding of history, there are quite a few moments in science where two somewhat unrelated things bash against each other and spark a new idea.
Sure, computers can do things we already know how to do, but actual inventors/scientists/people making stuff still need to think up things first before you can computerize it.
It’s possible that this WON’T do anything new in the realm of math, but it might create a string a researcher in a different domain–history, linguistics, whatever–can pull on to unravel something else. A diverse tool set leads to multiple ways to solve a given problem, and sometimes edge cases come up where one solution actually is better in some niche application because of something unique to the way it is shaped.
You’re not wrong that people can take inspiration from many different fields, but wild speculation about what could happen can be done for any new development, which makes it pointless and tiring when overused.
To generate most* solutions.
But I see your point.