That video of him talking like your sketchy brother begging for money was pretty memorable. "We will make it loan! We pay you back in full!"
Australia is inexorably heading toward an existential crisis if it embarks upon the revision of syntactic configurations and the deliberate selection of lexemes, driven by the escalating lexicographical incompetence endemic within the broader populace, whose functional illiteracy now appears to be a pervasive epistemological impediment to even rudimentary textual engagement.
Considering that Canadian banks are as unsustainable as any business on planet Earth, it makes sense for them to also be promoting other unsustainable policies.
"Oh, you're such a big strong man, Mr. FEMA executive! Please tell me more about how you ignored counties that voted Red in 2020!"

Semantics is the study of meaning in language, symbols, or signs.
So if one were Anti-Semantic, one would reject meaning, including the meaning of (((brackets)))
It's just a dumb play on words.
If you're "Anti-Semantic", that would mean.... putting words in (((brackets like this))) would be meaningless because you don't care when stuff adds meaning like that... Ironic, isn't it?
Most of my money has been spent on GOG in the past few years because most of the games worth looking at were made over 15 years ago.
(Shame they went woke, but at least it's DRM-Free so I can keep all my games locally)
People whose entire lives rely on fossil fuels, and who don't contribute anything to the material wealth of a society, telling everyone else it's time to "get real".
Like many moronic short-sighted ideas by people who aren't as smart as they think they are, it'll lead to mass death. I call these people "genocide advocates" because if they get their way, a billion people almost immediately die.
The same sort of arrogant people actually got power in the Soviet Union and Mao's China, and the result was exactly as I said.
Ironically, the people calling for an end to fossil fuels are literally bourgeoisie -- city people. They type on keyboards made of petrochemicals, look at screens made of petrochemicals, and sit in chairs made of petrochemicals in homes made of petrochemicals, writing about how we need to end fossil fuels, using electricity that, even if it's produced with solar, was often in reality made using fossil fuels, because the devices were made in China, which burns more coal than every other country on Earth combined.
Reality is that we can't rely on fossil fuels forever. Eventually there just aren't any more dead forests to burn and there's a real problem with things like climate. Thing is, it isn't a switch we can just magically flip. These people have no idea what the fundamental challenges of energy are. It's not something we magically do by 2030 because someone signed a paper. In fact, if they got what they wanted there'd be civil wars and decarbonization would become a dirty word for generations.
I've pointed out many times that Trump is to the left of Clinton by a substantial margin.
- He supported gay marriage on day 1 of his presidency (the first president in history to do so); Clinton signed the Defense of Marriage Act, which defined marriage as between a man and a woman
- He didn't cut welfare, Clinton did.
- He didn't try to balance a budget, Clinton did.
- Trump signed the Second Chance Act to reduce felonies; Clinton signed the 1994 crime bill that increased felonies
- Trump didn't start new wars, Clinton was fully engaged in the world police thing.
Trump previously ran as a Democrat, and he's still essentially a Democrat from 20 years ago. People take mean tweets and a red R next to his name and extrapolate them into a bunch of stuff he isn't.
Let's be honest, how many tabs are just bookmarks you intend to close after you visit?
For me... it's a lot. I have 32GB on my main machine for a reason, and that reason is probably to open more tabs.
They kind of have to because it's solely a speculative asset.
I did a huge post about this a day or two ago. People buy bitcoin because they think someday it will be all the money and they want to own a significant percentage of all the money. The problem is that this use case doesn't actually involve buying anything with bitcoin.
A currency, according to Austrian economists, is something that works as a unit of account (how much does an apple cost?), a store of value, and a medium of exchange. Presently, bitcoin fails all three: the first and second because the price varies so wildly, the third because you can't really buy anything with it other than dollars.
I proposed a system be integrated into bitcoin that would grant incentives for using bitcoin based on the total GDP of bitcoin. The idea is that you'd want to increase the money supply as the amount of goods and services bought and sold using the currency increased, so that the relative value of one bitcoin stays the same in spite of there being more you can do with it. You'd also want to decrease the money supply as the amount of goods and services bought and sold decreased, so you'd increase service fees and destroy some bitcoin with every transaction to lower the total number in circulation. All of these decisions would be made by a model predictive controller using the known blockchain data set.
I recognize that to make such changes arguably you wouldn't have bitcoin anymore since its defining features would be eliminated, but as things go it's just a thing you buy because you want it to go up.
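For anyone curious what that feedback rule might look like, here's a toy sketch. The function name, the gain constant, and the numbers are all hypothetical; the actual proposal calls for a model predictive controller over real blockchain data, which this much simpler proportional rule only gestures at.

```python
# Toy sketch of the supply-adjustment idea above. Everything here is
# hypothetical and simplified: a real design would use a model
# predictive controller over on-chain data, not a proportional rule.

def adjust_supply(gdp, prev_gdp, supply, k=0.5):
    """Return (new_supply, burn_fee_rate) given the change in on-chain GDP.

    If GDP grew, mint new coins so one coin's value stays roughly
    constant; if GDP shrank, raise the per-transaction burn fee so the
    supply contracts along with it.
    """
    growth = (gdp - prev_gdp) / prev_gdp
    if growth >= 0:
        # Expand the supply in proportion to real activity growth.
        return supply * (1 + k * growth), 0.0
    # Contract: burn a fee on each transaction instead of minting.
    return supply, -k * growth

# GDP rose 10%, so the supply expands and no burn fee is charged.
new_supply, fee = adjust_supply(gdp=110.0, prev_gdp=100.0, supply=1000.0)
print(new_supply, fee)  # 1050.0 0.0
```

The design choice this illustrates is the symmetry: growth is met with issuance, contraction with burning, both keyed to measured economic activity rather than a fixed emission schedule.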
I don't have timelines at the moment, but I do have hope for at least 2 more books, one programming book and one hard science fiction book.
I always appreciate seeing your positive feedback too. I'm trying to be principled, but I'm still a human being, and seeing that people are interested in what I'm up to helps me feel like I'm not just shouting into the void.
I've started to realize that people are taking me up on my offer to ignore or block me if they don't like effortposting because I'm not gonna stop.
Probably for the best.
But in my view, there are only a few reasons to have discussions online.
1. To yell pre-packaged platitudes at each other for sport
2. To try to help the hours of our lives to tick away faster
3. To try to make the entire earth correct by correcting people one at a time
4. To become mutually better through putting ideas through the gauntlet
I've actually done some of these myself. When I was younger I'd happily argue online for sport, or I'd be bored and it was a good way to pass the time. I was even foolish enough to think I could help change the way the world saw things.
Today, however, the only reason that makes sense to discuss things online is to try to become better yourself and help better the people you discuss things with. We are all so far from what we could be, and I think that's been intentional by powers larger than ourselves.
I'm thankful to everyone who engages in good faith, perhaps especially people who push back and force me to better explain what I mean, or to better understand what I'm saying myself. Recently there have been quite a few people who did well at forcing me to think more about things I took for granted, or to clarify something. @Hyolobrika often asks one piercing question on a post, and it's like, "Well, I can see how without clarification it might look like I'm saying something I'm not."
I'm thankful to guys like @amerika who spend a lot of time and effort helping to explain worldviews that are fully alien to me, because how can you agree or disagree with that which you don't understand? I don't always come away agreeing totally, but often I come away with my worldview changed by exposure to ideas I hadn't explored myself.
When people interact with me and get a big wall of text, it might be easy to assume I'm just trying to stonewall or filibuster, but often it's actually me trying to work through ideas publicly, and often there's a lot of actual research behind the wall of text. It might seem like it's a stop in the discussion, but what's the point of continuing to discuss if we don't actually take a deep dive into ideas that could change everything?
I've made this comparison before, but consider this with respect to the speed of technological advancements particularly in the area of computers.
1974 saw the first commercially advertised computer at a home-computer sort of price point. It had a tape interface and memory, but generally it was not something we today would consider a home computer.
By 1984, most of the 8-bit computers that we know of had already been released. The Apple I, II, and Lisa were out, and the Macintosh was released that year. The Commodore 64 had been on sale for years. The IBM AT, based on the Intel 80286 processor, was released that year. The Atari 2600 had been released, had a renaissance, and caused the video game crash; in the ensuing crash, Nintendo released the Nintendo Entertainment System, which was leagues above the Atari 2600 as well as its contemporaries, the ColecoVision and Intellivision.
By 1994, the 32-bit Intel 80486, which contained an integrated math co-processor on the DX model, was relatively common. Doom and Wolfenstein 3D had already been released, and Descent, a fully 3D game, came out around that time. The internet already existed, and the Netscape web browser had been developed to some degree, meaning the World Wide Web already existed. The Super VGA video standard of the time supported 24-bit color, about 16 million colors.
By 2004, the first 64-bit consumer processors had been released. Video cards had ceased being mere 2D accelerators and become 3D accelerators that could draw triangles on screen very quickly, and years earlier had become the graphics processing units first marketed by Nvidia. By then, 3dfx had been born, lived, and died. Pixel shaders and vertex shaders were available on all new top-of-the-line GPUs. And I do have a point here: in spite of 64-bit processors having been released by this time, most consumer PCs were still 32-bit.
Here's where you can really start to see some of the stagnation take place, as the innovation moved from one product category to another. From 2004 to 2014 things got incrementally better: top-end technologies such as 64-bit and multicore became common in consumer PCs, and the amount of RAM in a PC substantially increased, so in 2005 you might have had 128MB, while in 2015 you'd often have 2GB. Beyond that, though, things improved only a little, and not in the same way. Compare any decade before that, and you can really see the difference. The one big thing that happened from 2005 to 2015 was the development of the entire mobile ecosystem. I have a 2013 Moto X still sitting in a drawer at home, and while it isn't perfect, it's shocking how usable it is even now. The big thing is that, for the most part, a computer from 2004 isn't great, but a high-end one isn't so different from what you'd see in 2014.
Now we finally come to 2014 through today. The last 10 years have probably been the most disappointing 10 years since the 1970s. Most of my websites are hosted on computers made before 2014. My travel computer is a computer made before 2014. Although it cuts across the decade, my gaming computer is pre-pandemic, and that 5-year-old PC is essentially state of the art. Instead of a 4060 it has a 2060, but even RTX, as potentially groundbreaking as it is, doesn't really matter all that much almost anywhere. You won't be able to run everything at high settings, but in terms of graphics a GTX 980 will still play virtually every game on the market today.
So in this context, you can really see why the sort of enthusiasm about the most advanced technologies just isn't there anymore, because a lot of things have simply slowed down. There's been some really exciting stuff on the software front, such as the fediverse and Nextcloud essentially taking the sort of software that used to be solely proprietary and democratizing it, but once you look at the massive leaps of previous decades compared to today, there really isn't any comparison.