Details
-
About: overcaffeinated, autodidact, CS student & GWT Web Dev interested in the Semantic Web, data, and bioinformatics. My ideal job is as a Lisp/Python Developer || Tech-Journalist (my personal website isn't mobile-optimized yet, yikes)
-
Skills: C/C++, Java, Common Lisp, Python, JavaScript (VanillaJS/Vue/GWT), Semantic Web, Linux, Data Queen
-
Location: NC
-
Website
-
GitHub
Joined devRant on 5/23/2017
-
I swear I'll snap if someone tells me it's weird that I resize applications to be taller than they are wide. I keep them that way because widescreen monitors only came into existence when computers went mainstream and the market shifted to the plebs who just used them to watch videos and didn't want to see any bars on screen, and now we all have to suffer.
Web pages are organized vertically, so it makes no sense for me to browse the web in full screen: it wastes space, whereas a narrow window keeps all the content contained and distraction-free, and most pages drop the sidebars at narrow widths, so you'll also see a few less ads. I can also use and organize multiple apps however I want. One small thing too: browsing the web in full screen means pages can work out the exact dimensions of your screen and learn more about you (I don't care much about this, but it's worth mentioning).
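(A quick VanillaJS sketch of the dimensions point; window.innerWidth, window.innerHeight, screen.width, and screen.height are all standard browser properties, and the check itself is just my illustration:)

// any page can compare your viewport to your monitor:
const fullScreenish =
  window.innerWidth === screen.width &&
  window.innerHeight === screen.height;
console.log(screen.width, screen.height, fullScreenish);
// full-screen browsing makes the viewport line up with the monitor's
// exact resolution; a resized, taller-than-wide window only exposes
// its own arbitrary viewport size.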
I promise you there are so many good reasons to not use apps in full screen.
Thank you for coming to my TED Talk.
-
Coworker just showed me how he avoids merge conflicts, and I'm undecided on it. We use a feature-branch workflow, so if a feature takes a long time to finish, it can mean merging master in multiple times. He avoids that by stashing his changes instead of committing them, so when he needs to merge master into the branch, the branch is still clean. When the feature is done and he's ready to commit, he pops the stash, git diff shows all the changes before he pushes, and he just fixes up what he needs instead of being forced through the horrible merge software.
There must be problems with this, right? This seems too easy for it not to be the standard.
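(For reference, the flow looks roughly like this; a sketch with a made-up branch name, not necessarily his exact commands:)

git checkout feature/big-refactor   # hypothetical long-running branch
# ...work away, never committing...
git stash                           # park the work in progress
git fetch origin
git merge origin/master             # no local commits, so the merge stays trivial
git stash pop                       # replay the WIP on top of the fresh master
git diff                            # review everything before finally committing

The catch I can see is that git stash pop can hit the same conflicts a real merge would, and an unpushed stash is the only copy of the work if anything goes wrong.
-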
Dude in my Calc 2 class just bitched about iPhones having "shitty software", referencing that bug from ~6 years ago where a specific iMessage text would reboot your phone. IMO, 99% of what Apple does well is software. UI is subjective, but Final Cut Pro is unbelievable in terms of functionality for its price, and their software is so well optimized that iPhones have been able to use comparably tiny batteries and still compete. They're consistent across the whole company with software design, while companies like Google are so stratified that it took years before Material Design was implemented in all their services, and there are still a few where it isn't (not to mention the meme of Google killing off all their projects). I hate tablets, but the iPad Pro has the best software/hardware implementation of any I've ever seen. Apple's interconnectivity between devices is unbelievable, whether it's Continuity features or the setup process just recognizing devices around you and pulling data over to create consistent account info, saving you taps. Siri is shit, but apart from that their software isn't bad enough to be the thing you complain about instead of...
Their Macs are fucking pressure cookers, their fuckin marketing department is like a different company altogether, and their anti-fix-it-yourself policies are so user-hostile that they're toe-to-toe with Oracle for customer abuse.
TL;DR: the biggest scam Apple has pulled off is not that the sheep still think Android and PC users are living in 2010, but that they've convinced the sheep that they know what shitty software is. At that point they're too many levels deep and there is no red pill strong enough for them.
-
Lisp is such a cool language, and I feel like, because functional programming is becoming so popular, Lisp could end up being the go-to language: it's so versatile and, though there are many parens, it's friendlier at first glance than Haskell. (And there are so many libraries for it, omg)
-
If you type capital letters by hitting the caps-lock key twice, you don't deserve to use computers.
-
I don't mind Apple marketing themselves as these revolutionary thinkers and innovators, because I figure most people see behind the marketing but appreciate Apple for what it is: a big company that makes well-built, efficient products and gives good support for them.
But I'm sick to death of tech journalists talking about how every new feature is the death of Android. They have to be kidding themselves if they think what Apple's doing is innovating. Samsung has been designing screens for the bezel-less market for a LONG time, and their technology there is incredibly advanced (it's why, if you use an iPhone X, you'll be looking at a screen made by Samsung!)
They finally adopted wireless charging and pretended it was brand new, but I remember when they came out with the Apple Watch, marketing it like they'd broken ground when Android Wear watches had been out for a year!
I don't want people to think I hate Apple; I own a few of their products. I think they're remarkably invested in user privacy; HomeKit, imo, is one of the most forward-thinking implementations of smart home technology that I've seen, and the new processor in the iPhone X is a mammoth powerhouse. So I'm not saying anything against any of that. What I am saying is that they're incredible at marketing, but fanboys are not self-aware enough to recognize when the Designed-by-Apple hype overshadows the actual objectivity of the situation. There are already articles hyping Apple's wireless charging.
TL;DR: I swear to god, if an Apple fanboy comes at me saying the bezel-less design was Apple's innovation, I'm going to snap. I appreciate what Apple does well, but unfortunately people can't appreciate a product without needing to identify with it.
-
Before starting to program, I was impatient with the progress of technology; now that I've started learning to program, I'm intimidated by the progress of the tech industry.