Considering AGI and contemplating life in 2023/10

  • Wow, I really can’t imagine a future where humans are researchers in HCI (blu3mo).

  • https://twitter.com/blu3mo/status/1715208339733303491

  • I can’t imagine a future where humans are researchers in 10-20 years.

  • I wonder how other people who are considering a similar path perceive this.

    • Replacing “researchers” with “a pursuit where the evaluation doesn’t change depending on who did it ∧ humans are weaker than computers” seems to work.

      • I thought about this quite a bit when writing the definition.
      • The evaluation doesn’t change depending on who did it.
        • “Winning against 100 people in shogi” or “running 100 m in 5 seconds” have value when humans do them, but no value when computers or machines do them.
        • “Producing research results” or “developing software” have value regardless of whether humans or computers do them.
        • In this context, “value” refers to various important aspects of life, such as “value to oneself”, “value to others in one’s community”, or “economic value in one’s economic sphere.”
          • If you could beat 100 people in shogi, you would probably be happy, people around you would praise you, and you could also earn money from it.
  • https://twitter.com/blu3mo/status/1716481148904210583

  • Certainly, there seem to be cases where humans are partially necessary for a while.

  • However, I feel that a world where humans are socially and economically recognized as the players driving research will not last long.

https://twitter.com/blu3mo/status/1715452348414177407

I feel the difficulty lies not in researchers’ motivation but in economic value. (If humans and computers can do the same research, I think those who provide research funding would choose the cheaper option.)