StragaSevera
6 July 2016
#методика_методов
I'm reading an interesting interview with Yudkowsky. http://blogs.scientificamerican.com/cross-check/ai-visionary-eliezer-yudkowsky-on-the-singularity-bayesian-brains-and-closet-goblins/
Copy-pasting the nicest quotes here:

"Literal immortality seems hard. Living significantly longer than a few trillion years requires us to be wrong about the expected fate of the expanding universe. Living longer than, say, a googolplex years, requires us to be wrong about the basic character of physical law, not just the details.
Even if some of the wilder speculations are true and it's possible for our universe to spawn baby universes, that doesn't get us literal immortality. To live significantly past a googolplex years without repeating yourself, you need computing structures containing more than a googol elements, and those won't fit inside a single Hubble volume.
And a googolplex is hardly infinity. To paraphrase Martin Gardner, Graham's Number is still relatively small because most finite numbers are very much larger. Look up the fast-growing hierarchy if you really want to have your mind blown, well, eternity is longer than that. Only weird and frankly terrifying anthropic theories would let you live long enough to gaze, perhaps knowingly and perhaps not, upon the halting of the longest-running halting Turing machine with 100 states.
But I'm not sure that living to look upon the 100th Busy Beaver Number feels to me like it matters very much on a deep emotional level. I have some imaginative sympathy with myself a subjective century from now. That me will be in a position to sympathize with their future self a subjective century later. And maybe somewhere down the line is someone who faces the prospect of their future self not existing at all, and they might be very sad about that; but I'm not sure I can imagine who that person will be. "I want to live one more day. Tomorrow I'll still want to live one more day. Therefore I want to live forever, proof by induction on the positive integers." Even my desire for merely physical-universe-containable longevity is an abstract want by induction; it's not that I can actually imagine myself a trillion years later."
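
A quick sketch, in Python, of the counting argument behind the "more than a googol elements" claim in the first paragraph of that quote; the two-state-element model and the log-space arithmetic are my own simplification, not anything from the interview:

import math

GOOGOL = 10**100  # a googol

# A machine built from n two-state elements has at most 2**n distinct
# configurations, so over a googolplex (10**GOOGOL) time steps it must start
# repeating itself unless 2**n >= 10**GOOGOL, i.e. n >= GOOGOL * log2(10).
min_elements = GOOGOL * math.log2(10)   # roughly 3.3e100

print(min_elements > GOOGOL)  # True: you need more than a googol elements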

"There is a misapprehension, I think, of the nature of rationality, which is to think that it's rational to believe "there are no closet goblins" because belief in closet goblins is foolish, immature, outdated, the sort of thing that stupid people believe. The true principle is that you go in your closet and look. So that in possible universes where there are closet goblins, you end up believing in closet goblins, and in universes with no closet goblins, you end up disbelieving in closet goblins."

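The closet-goblin point is essentially belief updating on observation. A toy Bayes-rule illustration in Python (the prior and likelihood numbers are mine, purely for illustration):

def posterior(prior, p_obs_if_goblins, p_obs_if_none):
    # P(goblins | observation): Bayes' rule, using the likelihood of what you
    # actually saw under each hypothesis
    num = prior * p_obs_if_goblins
    return num / (num + (1 - prior) * p_obs_if_none)

prior = 0.01  # start out thinking closet goblins are very unlikely

# Universe with goblins: you look and see one (likely if real, unlikely otherwise).
print(posterior(prior, 0.95, 0.001))  # ~0.91 -> you end up believing

# Universe without goblins: you look and see nothing.
print(posterior(prior, 0.05, 0.999))  # ~0.0005 -> you end up disbelieving
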
"Human axons transmit information at around a millionth of the speed of light, even when it comes to heat dissipation each synaptic operation in the brain consumes around a million times the minimum heat dissipation for an irreversible binary operation at 300 Kelvin, and so on. Why think the brain's software is closer to optimal than the hardware?"
6 July 2016
7 comments
Хэлен
And not a word in Russian :(
tesey
If only they'd put some kind of tag on it, like "English". And I would hide it... hide it...
"like "English"" - am I the only one who gets a "Lord English" association?
tesey
Матемаг
It could be in English. Without any associations. Just some tag, at least!
Хэлен
Матемаг
Who is Lord English? 0_o
If my English is good enough, the beginning says that life is hard for an immortal.
If you measure "life" in trillions of years.
And that the psychology of such a person must be fundamentally different from the one we know.
And if we want to live that long, we need a different struc... *crossed out* we need to become computers. But a "computer" that lives for trillions of years will accumulate a huge amount of memory that has to be stored somewhere.

(level 90 translator, yeah)
I got too lazy / it got too hard to translate any further.
Tesey, it's just that you wrote "English" with a capital letter :)

Хэлл, that's Homestuck :) Read it!