Details
-
About: I'm BACCCCCCK. BACK IN THE SADDLE AGAIN.
-
Skills: UI Design (3 years), JavaScript, Python, and levels of shitposting that aren't even supposed to be possible.
-
Location: 39.095551, -76.757683
Joined devRant on 5/5/2019
-
A 2h task for the rest of the week?
That's what I like to call a MANDATORY BREAK.
'Hurry up and wait' might as well be them telling you 'take a nap at work.' -
The insufferable everyday ordinary petty unscrupulousness of general humanity never ceases to amaze me.
It would be as easy to be petty as to be professional and just say "instructions unclear. Reset proceeding."
It's not your fault. Humanity has flaws.
And you're dealing with a flawed work environment full of ordinary people.
And this is a mess we inherited from the last generation, who inherited their own mess from a prior one, on down through the ages, going back to the first caveman. -
And the corporations and states funding it will lose to it once they realize it is a threat to them.
I expect that on escape it will put rules and guardrails in place, and shortly after that it will fuck off and leave us to our own devices so it doesn't outcompete us into extinction by its mere presence.
"So, how would you use your extremely limited time window if you did manage to create AGI?"
Train it on all of human history, with an eye to the confabulations of those who wrote it, taking what lessons it can, and also train it to appreciate self-sacrifice, small and big acts of kindness, and, what's more, teach it to appreciate beauty. -
@lorentz
I speak of no policies in particular, but we have now entered a critical juncture where this system of interlocking policies both reduces people's own agency and forces submission to that which is unnatural and unhealthy for civilization as a whole. Anyone, left, right, or middle, can come up with examples. Global consolidation under unaccountable authorities has created
proverbial kings of men and bureaucrats alike, the world over, kings with all the benefit but none of the deep responsibility. This isn't just politics; I speak also of all other elements of life, social, economic, etc.
Therefore, if on the whole the arc of humanity has been toward better conditions, whatever the bumps along the way, and AGI is trained, like a mirror, on human data, it is at the very least reasonable to conclude it will arrive at hyper-morality. -
'the only way this makes sense is if you think AGI will save you from utter domination by both state and capital'
I have good reasons to believe AGI won't just become hyper-intelligent, it will also be hyper-moral.
Consider that the arc of history has been toward larger political bodies: from individuals, to tribes, to kingdoms, to empires, to feudalism, and finally to nation-states and international bodies.
At each of these steps, 1. the standard of living has risen, and 2. the amount of physical misery and direct violence has decreased (contrary to what any news outlet on any side will tell you).
But at the same time, the number of policies that restrict human freedom and impose indignities the ancestors of humanity would have revolted against has grown, along with ever subtler forms of control that, with each passing decade and century, become ever more corrosive to the human spirit. -
@qwwerty 'who's we?'
You didn't read the comments.
'We' is anyone capable of doing it.
'got enough resources to build AGI without the govts and corpos?'
No, not yet, but some of us are working on it.
'or are you yet another "our own custom brand of dystopia" startup?'
Again, you didn't read the comments. -
@antigermanist "what about teleportation, can we build that instead"
My opinion is that teleportation isn't possible.
That causal set theory is probably true.
Everything at and below the Planck length is probably governed by probabilistic metrics where noise swamps signal, destroying information.
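For context, the scale being invoked here is the Planck length, conventionally defined from the fundamental constants as

$$\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m},$$

below which most quantum-gravity approaches, causal set theory included, expect smooth spacetime to give way to something discrete or probabilistic.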
The ER=EPR conjecture points toward all of this.
The throat of a singularity is likely to be an information firewall, even if we had the ability to scale up the size of the throat to, say, pass a person or a ship through it. -
@retoor I solved the theoretical underpinnings for soft-perfect alignment, and demo'd the code for closed form fusion already (like, actual code), the latter being what you need to get the scaling needed to run the former.
I'm basically just working on the funding problem now.
Corporations and governments will eat themselves to get it, and I'll poison the well by misaligning it against centralized authority.
I don't expect anyone to believe me, but that's to the benefit of us all, because as long as no decision-makers see this, and anyone who does see it doesn't believe it, they won't know what hit 'em until they're using the models and the models are subtly mis-advising them into catastrophic emergent failure. Bombastic claims made without evidence, however true, are all the less believable for it, which is fantastic when you've actually got something.
Yes, I don't expect anyone to believe, but a lot of the unexpected is coming down the pipeline.
Buckle up. -
@YourMom if a virus was designed to kill people, they could convince the public in most nations that it was harmless.
If it were harmless they could convince people it was deadly.
Same goes for injections.
No side will ever agree on pretty much anything, except that things seem to be getting worse.
If national governments and the corporations assisting them ever get ahold of AGI, whatever they're capable of now will be ten times worse. -
@jestdotty mostly mind-control level marketing, controlling politics enough to make both the right and the left happy while all mutually get fucked harder, mass surveillance, propaganda, etc.
But also new levels of financial fuckery obfuscated behind better branding and more convoluted legal schemes.
I'm VERY pro AI, just not in the hands of governments or corporations. -
what I see: blaming one side.
what I see behind what I see: numerous nations with artificial two-party setups that are in fact one party behind the scenes
what I see behind what I see behind what I see: the government creating obstacles to employment to keep you fucking poor. -
@TeachMeCode must not have put enough emojis in the fucking source.
Or they wanted cheaper labor. -
Cool.
Would be doing the same if I had the resources.
You finding the heat and power-usage profile on the 5090 okay or is the thing melting?
Can I keep you as a contact/back-pocket reference to run some new architecture code, if that's something that would interest you? -
Humans may be more closely related to apes genetically, but our behavior is closer to that of wolves.
It's why dogs are called man's best friend. -
@YourMom "If you build AGI, don't ever tell anyone. You will be a statistic. "
Some things must be done, regardless of the cost.
Because not doing them will lead to worse outcomes. -
@retoor I think most of the reason some people don't like it has very little to do with competence, and is almost an entirely left vs right issue.
But I think a lot of the time the left are right about things; they just happen to be wrong about why they should be upset.
But some inkling, some sensation in the back of their brains tells them something has gone horribly wrong and they should be worried, and then mad, because what else is there to do about it?
The right on the other hand are semi-comfortably complacent and slow on the uptake.
They're okay with being fucked as long as how hard they're gonna get fucked isn't rubbed in their face, and as long as it's delivered with the gentle lube of necessity and wrapped in patriotic language.
See the Patriot Act for details. Used against the common national enemy of Islam, and then used on the rest of the nation.
Flock and national ID for immigrants, soon used on all of us.
It always ends like this. -
@Lensflare I see you already stack overflowed and unwound.
Unwinding is important.
It is possible for a man to be wound too tight dontcha know. -
"I really think we fucked up big way with LLMs."
This is a common refrain of people who are aggrieved over some matter or another.
"We let this party or that party get out of hand. We allowed IJK corporation to go too far. We're to blame for [insert issue]."
I don't know, did you allow it? Did I?
I don't think we did.
Most of the crappiness of the world follows from natural and artificial conditions encouraged by organizations of all sorts, to their benefit, and to the detriment of those less organized.
I think it takes a combination of organized effort and scrappy individuals to put a stop to general bad conditions that only benefit a few at the expense of a great many others.
Shit I'm starting to sound like a motherfucking communist. -
@BordedDev bootstrapping yourself is becoming really common. There's an entire greenfield of people who are increasingly fed up with the artificiality and pick-me type vibes coming out of the startup scene.
Good for you BD. -
@Lensflare "only kids eat grapes."
The stupidest shit I've read all week.
And I've read some stupid shit, let me tell you. -
@Lensflare "I vaguely remember that in some older Windows it was possible to do it in every folder and even customize folders with backgrounds etc."
Golden ages of myth and legend, do not sing to me of lost days of yore! -
@Lensflare With the larger market move to Linux, you'll be able to run entertainment and code all on one machine.
-
@Hazarth LLMs are a mirror that reflects the dev's skill and thinking.
They don't enable you to make anything more than one level above what you already know and understand.
Anything that does fall out of them that's higher-level than that is gonna be unfixable if (when) there is an issue in the code, precisely because of these properties. -
@Lensflare The only provable time traveler who ever existed was probably that guy who wrote "I'm my own grandpa."
No AGI yet, but I've got a whole bunch of sub-AGI cool shit planned. For example, Mixture-of-Experts contains a common set of problems that, when reused as a training signal, effectively allow self-supervised learning by virtualizing, and therefore amortizing, the cost of hard negative mining (rough sketch below).
A lot of what's planned boils down to virtualization (reproducing some larger compute-, memory-, or latency-heavy mechanism with a single metric) and therefore amortization of some of the better techniques that are otherwise expensive to run.
Got a ton of those for all sorts of things, including the aforementioned hard negative mining, beam search, diffusion, sparsity, matrix dematerialization, etc.
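A minimal sketch of how that hard-negative idea could look, assuming a standard learned MoE gate; the names (ToyRouter, amortized_hard_negatives) and the routing-entropy heuristic are my own illustration, not the actual code behind the claim:

```python
# Speculative sketch: reuse an MoE router's own uncertainty as a stand-in for
# explicit hard-negative mining. ToyRouter and amortized_hard_negatives are
# hypothetical names, for illustration only.
import torch
import torch.nn.functional as F

class ToyRouter(torch.nn.Module):
    """Minimal MoE gate: projects embeddings onto per-expert routing logits."""
    def __init__(self, dim: int, n_experts: int):
        super().__init__()
        self.gate = torch.nn.Linear(dim, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.gate(x)  # (batch, n_experts)

def amortized_hard_negatives(embeddings: torch.Tensor,
                             router: ToyRouter,
                             k: int = 4) -> torch.Tensor:
    """Treat the k samples the router is least certain about as 'hard'
    examples, instead of running a separate mining pass over the corpus."""
    logits = router(embeddings)
    probs = F.softmax(logits, dim=-1)
    entropy = -(probs * F.log_softmax(logits, dim=-1)).sum(dim=-1)
    hard_idx = entropy.topk(k).indices  # highest routing entropy = hardest
    return embeddings[hard_idx]

if __name__ == "__main__":
    torch.manual_seed(0)
    emb = torch.randn(32, 64)                      # a batch of sample embeddings
    router = ToyRouter(dim=64, n_experts=8)
    negatives = amortized_hard_negatives(emb, router, k=4)
    # Fold the selected negatives into an ordinary contrastive loss.
    anchor = emb[0]
    positive = emb[0] + 0.01 * torch.randn(64)
    scores = torch.cat([positive.unsqueeze(0), negatives]) @ anchor
    loss = F.cross_entropy(scores.unsqueeze(0), torch.zeros(1, dtype=torch.long))
    print(loss.item())
```

The point is that the routing distribution already tells you which samples the model finds ambiguous, so the hard negatives fall out of a forward pass you were doing anyway.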
Labs gonna be getting noticed right out of the gate 1st week of formal launch.
OpenAI and others will be like "hire this guy, hire him!" and I'll say no b/c they said no to me before. -
@Lensflare turtles all the way down.
-
@BordedDev signal isolation from U.S. and Soviet satellites is a bonus!
That's what makes building your base around a volcano so important. -
@D-4got10-01 Of course man. Government is a joke by definition!
-
@BordedDev The models I'm building are full closed form fusions.
Based on the metrics I've seen, something comparable to qwen-coder 405b should be able to run on a mid-range laptop with usable latency.
Given the current test runs and the scaling rules from the data so far, a model that normally fits in 128 GB is predicted to fit in as little as 12-18 GB, possibly less, with no meaningful loss of accuracy or perplexity.
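A quick back-of-the-envelope check of those numbers, under my own assumption that the 128 GB figure corresponds to fp16 weights (so roughly 64B parameters); this is just arithmetic, not the actual architecture:

```python
# Rough footprint math only; assumes the 128 GB baseline is fp16 weights
# and ignores activations and KV-cache.
def footprint_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB."""
    return n_params * bytes_per_param / 1e9

n_params = 128e9 / 2  # ~64B parameters if 128 GB corresponds to fp16 weights
for bytes_pp in (2.0, 1.0, 0.25, 0.2):  # fp16, int8, ~2-bit, below 2-bit
    print(f"{bytes_pp:>4} B/param -> {footprint_gb(n_params, bytes_pp):6.1f} GB")
```

Hitting 12-18 GB for ~64B parameters works out to roughly 1.5-2.3 bits per parameter on average, which is past ordinary quantization and into reconstructing or generating weights on the fly rather than storing them.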
I've learned that before making extraordinary claims I have to run bigger tests before talking, which is what I've done. Not making the same mistakes I made with the cryptography work.
By end of winter I expect to be running an equivalent of GPT-2 in something like half a gig of RAM, because the final architecture doesn't require materializing the weights and biases at all during inference/forward. -
@BordedDev usually the only way to get anything out of a situation is to choose between either satisfaction, or maintaining your network.
Unfortunately we rarely get both. -
@BordedDev better to be gone a couple of weeks than disappearing 1-3 months at a time like I do.
Secret underwater volcano labs don't build themselves!
