Details
-
Skills: C#, Kotlin, Unity, Gamedev
Joined devRant on 6/25/2017
-
In most (programming) languages I know, OR conditions are evaluated sequentially and short-circuit: the first condition that passes is the one that wins, and the rest might not even be checked.
This is not how most people I know would process a choice like this. They check whether they want to do both, but end up deadlocked until the criteria are tuned so that only one of the options gets picked (assuming only one can be done).
Which choice gets picked matters, unlike an XOR, where the only thing that matters is that one choice is true and the other is false. The output of the latter is the value of the operation, while for the former it is the argument itself.
The thing is that in English, 'or' already has an implied exclusive meaning to it, so it's already interpreted as a logical XOR of sorts by default.
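A minimal Kotlin sketch of the difference (the functions and their printouts are made up for illustration):

    fun wantPizza(): Boolean { println("checked pizza"); return true }
    fun wantSushi(): Boolean { println("checked sushi"); return true }

    fun main() {
        // || short-circuits: evaluation stops at the first true condition,
        // so wantSushi() never runs and "checked sushi" is never printed.
        val either = wantPizza() || wantSushi()
        println(either)        // true

        // xor does not short-circuit: both sides are evaluated,
        // and the result is just a Boolean value, not "which one was picked".
        val exactlyOne = wantPizza() xor wantSushi()
        println(exactlyOne)    // false, because both are true
    }
-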
I'll just be their gaming buddy and bond with them while observing their behavior.
I just hope the next generation will have fixed their "too embarrassed to be seen with parents, even online" ageism by then 😋 -
I'm gonna go out on a limb here and say that this is one of the use cases that 'alias' was invented for.
-
You know the one about the programmer who moved to Mexico?
He wanted to be a señor developer. -
@windlessuser I think I see what you're getting at, but formulating a single indisputable theory (e.g. music theory) based on intrinsic laws (e.g. math) is something that I don't see software development doing unless you go into impractical levels of abstraction/generalization - it all boils down to manipulating electrons in the end, and I feel that what we need to do with them has way more branchable conceptual layers than music.
Your OSI model example is a good one, but how many more decades will it hold as a standard, I wonder? Do we have enough understanding to claim that the scientific models we use to interpret reality are absolute?
I think that it's more of a matter of how useful that model is in understanding rather than how definitive and "true" it is, because what we know changes.
I hope I'm making any sort of sense here 😅 -
To add to the previous counterpoint, standards don't start out as standards, be they ISO-developed or de facto organically emergent standards. They can't come into existence as battle-tested parts from day one.
Sometimes what your project is trying to solve, even if indirectly, is the tyranny of the standard itself, providing options that might ironically reach the status of standard themselves, thereby changing and evolving the standard.
While I agree that software development should be as respected as other engineering disciplines, I don't think it should get there by giving the same weight and persistence to standardization.
Our parts and materials are thoughtforms, more mutable and composable than any physically-bound ones.
I guess what I'm trying to say here is that standardization is very useful in figuring out how to build something, but I see it more as a default choice than a preferential one. -
@gacbl I have no idea since I skimped on those, but it's good to know that they're not easy to come by if I end up needing them.
I usually just VNC into pringles for the GUI access, but I figured they're probably underpowered for the kinds of things I'd like them to have HDMI output for, like transcoding, emulation and so on. -
Related rant: https://devrant.com/rants/873255/...
-
You seem to have made your decision already but I'll drop my 2 cents anyway.
Haven't used UE for more than a few hours but used Unity for years, so I can't really compare them accurately by myself, but I'd say it greatly depends on what you want to do - for example, beginner learning, mobile, 2D and XR are some of Unity's current strengths compared to other general-use engines.
UE seems to offer slightly higher (3D?) graphical quality out-of-the-box but that's about as much as I can say about it.
Uninformed as I may be, I'd still raise an eyebrow if someone claimed that UE can easily beat Unity in less directly comparable things like potential development speed, asset store content or ease of sourcing information from the community. -
@BindView What dimension are you typing from that has giants and dildos made for them?
-
I usually just draw one circle, but that circle is an automated higher order owl-drawing function generator.
-
@RTRMS https://en.wikipedia.org/wiki/...
-
I tried using gitmoji (https://gitmoji.carloscuesta.me/) to kinda-sorta tag my commits for a while... it didn't stick with my coworkers and I found that it cut down on my potential effective commit message length
-
@rhein7 If you have access to the router's configuration, you can always plant a Pi-hole somewhere in the house and change the DNS to force other machines to use it. You'd be a ninja adblocker and could claim you never "modified her internet" that way 😁
-
reddit is a source of information and discussion, devRant is a source of commiseration and stress relief.
They're both sources of memes though. -
I don't really like LinkedIn, but I put a link to it on my CV, along with my nearly empty Stackoverflow and Github profiles...ain't got time to be doing public stuff on social developer networks.
Said he as he posted to DevRant. -
Sounds like a pretty sweet business model to actually get to millionaire status (after that I'd get someone to do it for me) 🙃
-
From experience, I'd say that quote is more applicable to a corporate coding context, where you need to read and understand lots of pre-existing unchangeable code before even figuring out where to put your changes.
It's likely that the development methodology will be more waterfall-based and have more bureaucracy to get past before beginning the actual writing of code in such a context.
I do a fair amount of prototyping and small projects using a more agile, hands-off methodology and when you're staring at a blank canvas, you will probably form a general idea of the painting and paint in broader, instinctive strokes at first, refining as you iterate.
Essentially, I feel the amount of time spent coding is inversely proportional to the maturity and intended degree of completeness of the project. -
"I don't type fast, I just fix errors very quickly"
-
"One (or more) new comments on a rant you commented on"
- This better not be that damn unending face reveal rant that takes minutes to scroll to the bottom again -
@endor I'm no expert on finance, but I'd definitely put it more in the "financial common sense" category 😅
I guess that whether something is part of a hedging strategy or not depends on if you have something for it to hedge against (i.e. something that goes down when it goes up and vice-versa). -
While the rant as described seems to indicate that the friend doesn't know what he's talking about, I feel the need to point out that you can in fact make money out of prices going down using an appropriate hedging strategy.
-
I am saddened and shocked by how well I understand what this comic portrays.
-
"It can't play modern games, but you won't find a cheaper one"
-
If that's not one of the best arguments in favor of remote working, I don't know what is.
-
Well, I think the obvious way is to have more people talk to the same brain at the same time.
I think there are services that will train your model for you, or you can give the app to friends to talk to, or maybe use an automation program of the AutoHotKey sort to write premade sentences from a file. I'm not really sure what the specifics would entail. -
The first Terminator movies were made way before UX became even just a popular buzzterm in the software development world, much less a full-fledged field of study. They might even pre-date its existence entirely.
Which I guess is okay because they involve time-travelling androids who nuked mankind before they could create the field. -
Given a broad enough definition of algorithm, everyone works with them daily without realizing it.
Devs build programs that follow a sequence of actions to solve specific problems, so every program can be an algorithm. Your favorite chicken parm recipe can be an algorithm, and so can your morning pre-work routine or your technique for operating a car.
Maybe if you are a researcher or mathematician the working definition of algorithm is much stricter and domain-specific, but in general I think it's ok to say that you're devising an algorithm whenever you're thinking of a sequence of steps to take towards solving a problem, even if those steps are pretty high level because the lower level ones are generalized - even if your chicken parm is store-bought so you don't have to bread it, you still gotta know the steps to operate an oven.
Or you can just go the meme route and say "Algorithm [noun]: Word used by programmers when they do not want to explain what they did." 🙃
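To make the broad definition concrete, here's a toy Kotlin sketch (the steps are invented) where a recipe is literally just an ordered list of steps, i.e. an algorithm:

    // A "recipe" as an ordered list of steps.
    val chickenParm: List<() -> Unit> = listOf(
        { println("Preheat the oven") },
        { println("Bread the chicken (skip if store-bought)") },
        { println("Bake, then add sauce and cheese") },
        { println("Bake again until the cheese melts") }
    )

    fun main() {
        // Running the algorithm = executing its steps in sequence.
        chickenParm.forEach { step -> step() }
    }
-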
If it comes from exoplanet TRAPPIST-1B, maybe the slot is a bottle cap opener. Science will know.
-
And if being reasonable and honest fails, make sure their code and design decisions can be traced back to them when the inevitable happens; maybe after the hubris catches up with them they'll be more receptive 🤐