New proof dramatically compresses space needed for computation

baruchel | 183 points

Here's the gist:

For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, about 100 bits. To be exact, t/log(t).

Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
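To put rough numbers on the new bound (my own back-of-the-envelope illustration; the real theorem hides constants and a log factor, so tiny figures like "10 bits" are only suggestive):

    import math

    # Memory used by the new simulation, per the ~sqrt(t) bound,
    # for a few hypothetical step counts t.
    for t in (10**6, 10**9, 10**12):
        print(f"time {t:.0e} steps -> space on the order of {math.sqrt(t):.0e} bits")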

zerof1l | 20 hours ago

One of my favourite sci comms YouTubers explained this in great detail: https://youtu.be/8JuWdXrCmWg

Highly recommend

cyrillite | 19 hours ago

Related:

For algorithms, a little memory outweighs a lot of time. 343 points, 139 comments, 39 days ago. https://news.ycombinator.com/item?id=44055347

You need much less memory than time. 126 points, 11 comments, 22 days ago. https://news.ycombinator.com/item?id=44212855

JohnKemeny | 18 hours ago

I get that this is an interesting theoretical proof. I'm more interested in the inverse: trading memory for time to make things faster, even if they take more space. It seems to me the halting problem is almost a proof that the inverse is impossible.

Memoization is likely the best you can do.
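For what it's worth, memoization is that trade in miniature: spend memory on a cache so repeated work becomes a lookup. A toy sketch (mine, not from the article):

    from functools import lru_cache

    @lru_cache(maxsize=None)        # unbounded cache: memory spent...
    def fib(n: int) -> int:
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)

    print(fib(200))                 # ...so the exponential recursion finishes in linear time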

mikewarot | 19 hours ago

One thing that's surprised me throughout my career is just how inefficient most of the code that I've inherited is. I suspect we could do a lot more with the hardware we have if we simply became better at programming.

(Perhaps more AI coaching could help?)

gwbas1c | 15 hours ago

[ made this a top-level comment as I don't see it mentioned in other comments: ]

The result doesn't actually apply to _all_ algorithms, only oblivious ones. Oblivious algorithms are ones where the operation performed at step i is fixed in advance; it doesn't depend on the data produced in steps j < i. Almost all useful algorithms are non-oblivious, with some notable exceptions [1]. This work is a step towards proving the result for the general case, with little practical use (and that's perfectly okay).

[1] https://en.wikipedia.org/wiki/Sorting_network
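To make [1] concrete, here's a 4-input sorting network in a few lines of Python (my illustration, not code from the paper). The comparator positions are fixed up front, so the sequence of comparisons and memory accesses never depends on the input; that is the oblivious part.

    # The classic 5-comparator network for 4 inputs: each pair is
    # compared (and swapped if out of order) no matter what the data is.
    NETWORK_4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

    def sort4(xs):
        xs = list(xs)
        for i, j in NETWORK_4:
            if xs[i] > xs[j]:
                xs[i], xs[j] = xs[j], xs[i]
        return xs

    print(sort4([3, 1, 4, 2]))   # [1, 2, 3, 4], same comparison sequence for any input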

utopcell | 7 hours ago

I don't understand what they are saying at all. If I have a computation that requires going through a loop n times, why should the memory requirements increase with n at all?
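For example (my toy case), this touches n values but only ever keeps one running total alive:

    def checksum(n: int) -> int:
        acc = 0                        # a single integer of state, independent of n
        for i in range(n):             # n iterations of "time"
            acc = (acc + i * i) % 97
        return acc

    print(checksum(10**6))             # a million steps, still constant memory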

gweinberg | 11 hours ago

This seems very theoretical, just an upper bound on the space required, without talking about what is being computed. Does it have any import on real algorithms?

kristianp | 19 hours ago

Here's a quote from the SciAm article: "Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small."

Huh?

bluenose69 | 17 hours ago

>(Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small.)

My dude, that is NOT how rational numbers work.
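Presumably they meant that log(t) is tiny compared to t, so dividing by it doesn't change the picture much at this scale, but as written the sentence is nonsense. Quick numbers (mine, using a base-2 log):

    import math

    t = 10**9
    print(t / math.log2(t))   # about 3.3e7: a factor of ~30 below t, hardly "negligible"
    print(math.sqrt(t))       # about 3.2e4: the genuinely dramatic drop in the new result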

snickerbockers | 8 hours ago
