For me, most things can be fascinating. There are so many domains I have zero surface-level personal interest in, but that turn out to have nuances that are super interesting.
When someone else has that spark, and their eyes sparkle, and they beam as they talk about "their interest"? Idk, I love that. It makes me feel good to hear them. I feel like we both come away better for the conversation.
> How can you make yourself genuinely care about something you don't care about? It sounds as plausible as changing your own sexual orientation.
Most people don't care about the gym, but they do care about their health, especially as they age, so many learn to care about going to the gym even if they don't love every minute of their gym time. I'm one of those people.
1. You don't care about X until you do. Like, you can go for years without worrying about cholesterol. And then you can have a reason to care about it, and all of a sudden you do. The reason can be something that forces your hand, or just that you take an interest in the subject.
2. Altruism. Think less about caring and more about just doing, without expecting anything back. People notice, especially in selfless conversation.
I genuinely care about my friend. He's really into bee-keeping. I don't care at all about bees. But he cares about it, so I ask questions because I care about him. I have now learned enough about his bee-keeping to be legitimately interested in whether, say, his bees survived the winter or to be upset with him that an invading swarm killed them.
The simple answer to your question, I think, is that you probably can't "make yourself" care about a specific thing at the drop of a hat. But if you care deeply about other things, especially tangential things, it's relatively easy to learn to care about new things you encounter.
Not sure what the downvotes are for on this one. It depends a lot on what "genuine care" is supposed to mean. If you want to interpret that as a subconscious feeling then you're right. Feelings aren't normally controllable and calling them up on demand is pretty much impossible.
That being said, if you go through a bit of game theory and apply it to the real world, the experience of the last few millennia of recorded history is that the strategy most likely to get people what they want is lots of communication and setting up win-win deals for everyone. Someone who reliably offers win-win deals has a natural advantage over the more common person who thinks in terms of win-lose deals, and communities that make a habit of setting up win-win deals for their members have an overwhelming advantage over those that don't. If you tap into that type of thinking, it tends to translate into taking a real interest in how other people are doing, because it is easier to set up win-win deals if you know what their problems and goals are. A sensible sub-strategy is to be as kind as possible to everyone, to get into the habit of thinking empathically, and to keep channels of communication as open as possible.
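To make the game-theory point concrete, here's a minimal sketch (Python, purely illustrative; the strategies and payoff numbers are the standard iterated prisoner's dilemma, not anything specific from this thread) of why reliably offering win-win deals pays off over repeated interactions:

```python
# Illustrative sketch: in repeated interactions, players who offer "win-win"
# (cooperate, then mirror the other side) end up ahead of a habitual
# defector, even though defection wins any single round.

PAYOFF = {  # (my move, their move) -> my points
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first, then copy the opponent's last move."""
    return "C" if not history else history[-1]

def always_defect(history):
    """Treat every deal as win-lose: defect no matter what."""
    return "D"

def play_match(strat_a, strat_b, rounds=10):
    """Play an iterated game; return (score_a, score_b)."""
    hist_a, hist_b = [], []  # what each side has seen the *other* do
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# Round-robin: three cooperators and one defector.
players = [("win-win #1", tit_for_tat), ("win-win #2", tit_for_tat),
           ("win-win #3", tit_for_tat), ("win-lose", always_defect)]
totals = {name: 0 for name, _ in players}
for i, (name_a, a) in enumerate(players):
    for name_b, b in players[i + 1:]:
        sa, sb = play_match(a, b)
        totals[name_a] += sa
        totals[name_b] += sb

print(totals)  # the tit-for-tat players come out well ahead of the defector
```

In any single round the defector comes out ahead, but over repeated, remembered interactions the cooperative players finish with more points, which is the intuition behind the "win-win" claim above.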
So if "genuine care" means you literally feel something... nobody has much use for your feelings, we can't tell what your feelings are anyway and you probably can't call them up on demand. If "genuine care" means you try to figure out what other people want and then help them get it then that's simply good strategy and most people should find their way to it if they think about it for long enough. Some people have to think a bit harder than others and there are a few rare maniacs who really just want to cause pain and suffering. The maniacs are bad news.
Just define "general" as "as general as allowed by math, physics, and practical limitations." Or use a conventional reading of AGI as a human-level intelligence (which we, naturally, have a working example of).
Yeah, but if you do that, you then have to turn around and look at how all the goalposts keep moving around. That is what I was (originally) trying to get at, and why I phrased it the way I did. If we truly had actual (artificial) general intelligence (or were close to it), we would already have a solid definition/benchmark (and it... probably wouldn't be what you said, but something a lot more detailed/thorough). Right now both AGI and ASI are just... whatever. "It earns a hundred billion dollars in revenue," "It can do anything a general human can do" (ignoring the sheer amount of ambiguity in that alone), "It can do most tasks a human can do" (again, ambiguous: which human, which tasks, on and on and on).
North Korea is likely extremely safe as far as things like street violence and bullying are concerned. It's only unsafe for dissidents.
And you know, you can also ask people. In software there is a large population that grew up in the ex-USSR. Many of us still regularly visit the old country and talk to friends and family that live there. And we aren't all bots, despite what many seem to believe.