Found someone nice. It was sheer chance, really. Met with a new neighbor and she had a crush on me. Was friends for a while. Years later decided to get into a relationship with her.
Most PhDs at universities actually prefer to be called by their first name. As a graduate student, one of the most jarring culture shocks is learning to call professors by their first names. At least that’s the case in the US; not sure about elsewhere.
It’s a multifaceted answer for me, I feel.
Linux is weird, on a technical level. It’s funky and broken and has weird quirks you have to remember. But it’s not malicious. Wendell from Level1Techs said it best in one of his videos: the headaches with Linux are haphazard, the headaches with Windows are adversarial.
It’s not a perfect replacement for Windows, but at least for some people, the respect it has for its users (i.e., no ads, no fighting you on everything you’re trying to do, the freedom to tinker as you please) offsets its technical problems.
Additionally, Linux is missing a lot of core applications. Many applications do have a Linux version, many more can run through a compatibility layer, and of those that are left, many have really solid replacements. Heck, you might be surprised to find that some of the software you already use was originally intended as a replacement for a Windows-only application.
But there’s still a handful of core applications that don’t work on Linux and don’t really have a good replacement, and even one missing app can easily break someone’s workflow. No, LibreOffice isn’t a full replacement for Microsoft Office, and no, GIMP can’t actually replace Photoshop.
As for the terminal, there’s no way around it: you will have to open a terminal at some point. To be clear, most, if not all, things you might imagine yourself doing likely have some way of being done through a GUI. The issue is that as a new user, you don’t know where that GUI option is, what it’s called, or how to even ask. And when the tutorials you find online tell you to just use the terminal, that ends up being the only practical way of getting things done. So it’s a weird catch-22: only experienced users who know where all the menus are can find the GUI options, but it’s the new users who need them the most.
My understanding is that Linux developers in the past several years have been explicitly trying to make the OS more accessible to a new user, but it’s not quite there yet.
Overall, I think Linux is deeply flawed. But seeing how Microsoft seems to be actively trying to make Windows worse, Linux ends up being the only OS where I have faith that it will still be usable in two years.
If anything, the more people who switch to Linux, the more pressure there will be to make the OS more accessible to new users, and for software companies to release Linux-compatible versions of their software. Some brave people just need to take the dive first.
Patriotic duck
For me, everything is a belief unless it satisfies the following criteria:
I find that the one that trips up most people is #3, since some people speak in technically true but overly broad statements and the listener ends up filling in the gaps with their own biases. The listener leaves feeling like their biases have been confirmed by data, not realizing that they have been misled.
In the end, according to my criteria, very little can be categorized as true knowledge. But that’s fine. You can still make judgements from partial or biased data or personal beliefs. You just can’t be resolute about it and say that it’s true.
The Thing (1982) has basically consistently been my favorite horror movie
I mainly work indoors, so the brightness doesn’t really matter that much to me. But as far as I can tell, it’s pretty normal for laptops - I don’t think it’s any brighter or dimmer than other laptops I’ve used in the past. According to this website that I found, the brightness range is 25 to 486 nits, and a Google search suggests that the average maximum brightness for laptops is somewhere around 300-400 nits.
My understanding is that the screen is generally what eats up most of the battery on a device, so if you plan to have the brightness turned up, it might be difficult to find a laptop with a long battery life.
The CPU is on the mainboard and can’t be removed, but you can replace the entire mainboard. Basically, you can upgrade, but you’ll have to upgrade a couple of other things along with it.
Just tested with the normal power profile and the screen brightness turned down - the battery went down by about 50% after 3 hours. I think my laptop usually dies after 3 hours because I have the screen brightness up.
Yes, but that’s my point, you see. Because Arm has historically been used for mobile and small devices, there’s been a strong incentive for decades to emphasize power efficiency. Because x86 has historically been used for desktops, there’s been a strong incentive to emphasize raw performance. It’s only very recently that Arm attempted to reach comparable performance, and even more recently that x86 attempted to reach comparable power efficiency.
Sure, Arm is currently more efficient, but the general consensus is that there’s no inherent reason for why Arm must be more efficient than x86. In other words, the only reason it is more efficient is just because they’ve been focusing on efficiency for longer.
Both AMD’s and Intel’s current-gen x86 CPUs are, from what I can tell, within spitting distance of Qualcomm’s Arm CPUs in terms of battery life, and rumor has it that both x86 companies should be able to match Arm chips in efficiency by next gen.
So if efficiency is a priority for you, I think it’s worthwhile to wait and see what the cpu companies cook up in the next couple of years, especially as both AMD and Intel seem to be heavily focused on maximizing efficiency right now
My understanding is that Arm chips don’t have any fundamental advantage over x86 chips. They’re more efficient simply because they’ve been optimized for efficiency for so long. I’ve heard that upcoming Intel and AMD chips could be able to compete with the new Arm CPUs, so if you’re not going to get a new laptop soon, it seems worthwhile to just wait and see.
Yes, I don’t use the external GPU; I just use the AMD APU. Also, I realized that AMD 7000 could refer to both the CPU and the GPU. Ah, AMD and their marketing.
Kubuntu on a Framework 16 with an AMD 7000 series CPU here. Sleep is horrible - it definitely drains your battery. The bag heats up, and I estimate maybe a 1% drain per hour. I’ve enabled hibernate, though I rarely use it.
Battery is alright but not great. I get maybe 2-3 hours of active, light use from full battery.
No compatibility issues that I’ve noticed, though, of course, Linux has its fair share of minor non-hardware-related bugs.
Camera is serviceable but not amazing. Not sure about the microphone, but I assume the same. Speakers are somewhat odd in that they’re pointed to the side rather than toward the front, but again - serviceable.
I think having higher frame rates isn’t necessarily about whether our eyes can perceive each individual frame. As another commenter pointed out, there are latency benefits, but the frame rate also affects how things smear and ghost as you move them around quickly. I don’t just mean in gaming - actually, it’s more obvious when you’re just reading an article or writing something in Word. If you scroll quickly, the words blur and jitter more at low frame rates, and this is absolutely something you can notice. You might not be able to tell the frametime, but you can notice that a word is here one moment and, next thing you know, it has teleported 1 cm away.
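To put rough numbers on that jitter, here’s a small sketch of how far content jumps between consecutive frames while scrolling. The scroll speed is an assumed value for a fast flick, not a measurement:

```python
# How far on-screen content jumps between consecutive frames while scrolling.
# The scroll speed below is a hypothetical value, just for illustration.
scroll_speed = 3000  # pixels per second (assumed fast flick-scroll)

for fps in (60, 144):
    jump = scroll_speed / fps  # pixels the content moves per frame
    print(f"{fps} fps: content jumps {jump:.0f} px between frames")
```

At that assumed speed, a word jumps about 50 px per frame at 60 fps versus about 21 px at 144 fps - on a typical screen density, the 60 fps jump is on the order of half a centimeter, which is why fast scrolling reads as “teleporting” rather than smooth motion.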
Well, I’m just covering my bases. I can’t personally imagine an instance in which it would be helpful, but human nature (and especially human nature of children) can be really hard to predict and I won’t deny that I might have missed a case in which it could be helpful.
My upbringing was extremely “do what you want, but deal with the consequences.”
“You can watch an R-rated horror movie, but don’t come to me if you can’t sleep at night”-type of situation.
My impression has generally been that my freedom to do what I wanted let me learn a lot about decision making, responsibility, curiosity, and modern survival skills like Googling things. I’m genuinely baffled by how poorly some people my age use the computer and find things on Google, and I somewhat suspect many of them simply haven’t had the opportunity to explore technology on their own. And a lot of my hobbies developed exactly because I was allowed to do what I wanted when I was a child.
As for children doing stupid shit and searching up things that aren’t appropriate for their age, my thought has generally been, why is it the parent’s role to keep that from the child? I strongly believe that a parent’s role is to prepare the child to be a functional adult, not to baby them.
I acknowledge that all children are different, and perhaps there are some cases in which having parental controls would help. But I think my life would be duller if I were raised with parental controls.
Edit: having read some of the other comments, I think there are two aspects to the question of parental controls. The first is the aspect of children learning about age-inappropriate things, which is what I’ve mainly been focusing on. The other is the aspect of discipline and management (i.e., preventing your children from spending 12 hours on YouTube). I think people have made interesting points about this aspect, and I respect their opinions. I personally agree with BananaKing’s take that parental controls are the wrong tool for the job: train your children properly and you shouldn’t need parental controls to manage their screen time.
I liked Noroi: The Curse.
No jumpscares, but really quite unsettling
You can’t avoid it outright, but you can at least try to minimize your exposure. The easiest way is to avoid buying products that use plastic packaging, especially food. Don’t microwave plastics, even the supposedly “food safe” ones - that releases a ton of microplastics into your food. Don’t order takeout - again, lots of plastic in the containers, and even paper food containers have a plastic coating.
Don’t touch receipts, especially with wet hands. Or at minimum, wash your hands thoroughly after touching them.
I don’t particularly like the layout of LibreOffice, but I find that OnlyOffice works for me. Not as feature-rich, I suspect, but it doesn’t disrupt my workflow because of how similar it is to Microsoft Office.
Without another name change, I don’t think that phrase will ever go away, for the simple fact that X as a name is too short and nondescript. In speech, X could refer to someone you broke up with, or it could just be the beginning of another word, serving as a prefix. In text, it could refer to the actual letter itself, or the close button on a window, or a placeholder, or something NSFW.
There are simply too many ways that X can be interpreted; even if people come to associate Twitter with X, they’ll still specify “formerly Twitter” just to avoid confusion.