<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: AkBKukU</title><link>https://news.ycombinator.com/user?id=AkBKukU</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 23 Apr 2026 08:53:25 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=AkBKukU" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by AkBKukU in "ChatGPT Images 2.0"]]></title><description><![CDATA[
<p>> can't afford to pay one for whatever business I have<p>At small scales, what "art" does your business need? If you can't afford to hire an artist (which is completely fine, I couldn't for my business!), do you really <i>need</i> the art, or are you trying to make your "brand" look more polished than it actually is? Leverage your small scale while you can, because there isn't as much of an expectation of polish.<p>And no, a band poster doesn't have to be a labor of love. But it doesn't have to be some big, showy piece of art either. If I saw a small band with a clearly AI-generated poster, it would make me question how they source their music as well.</p>
]]></description><pubDate>Tue, 21 Apr 2026 23:04:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47855944</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=47855944</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47855944</guid></item><item><title><![CDATA[New comment by AkBKukU in "DaVinci Resolve – Photo"]]></title><description><![CDATA[
<p>I've been editing on linux with Resolve since the launch of the BMPCC4K in 2018. Were you trying to import MP4 footage? BMD can't be bothered to pay for the AAC audio codec for linux users even if they buy Studio, so if you pay for Studio you can read the h264 video stream but not the AAC audio. I end up converting everything to MOVs with pcm_s16le audio as a workaround.<p>The ALSA issues are beyond aggravating at this point. You do not want to actually run ALSA directly; you need it to connect to pulseaudio on 24.04. But I still have never been able to record audio within Resolve. I've had mixed luck on newer wayland+pipewire setups, having to install the bridge packages to connect the different backends. Linux audio is cursed on its own, so I don't fully blame BMD.<p>I exclusively run Kubuntu and have been using makeresolvedeb[1] for installing Resolve, and it has been pretty good.<p>[1] <a href="https://www.danieltufvesson.com/makeresolvedeb" rel="nofollow">https://www.danieltufvesson.com/makeresolvedeb</a></p>
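The conversion workaround above can be sketched roughly like this (a minimal Python wrapper; the function name and filenames are placeholders, but `-c:v copy` and `-c:a pcm_s16le` are standard ffmpeg options):

```python
def mov_convert_cmd(src, dst):
    """Build an ffmpeg command that copies the H.264 video stream
    untouched and re-encodes the AAC audio to 16-bit little-endian
    PCM, which Resolve on Linux can read."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "copy",       # no video re-encode: fast and lossless
        "-c:a", "pcm_s16le",  # uncompressed audio Resolve accepts
        dst,
    ]

# To actually convert a clip:
#   subprocess.run(mov_convert_cmd("clip.mp4", "clip.mov"), check=True)
```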
]]></description><pubDate>Tue, 14 Apr 2026 17:45:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=47768792</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=47768792</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47768792</guid></item><item><title><![CDATA[New comment by AkBKukU in "I replaced Animal Crossing's dialogue with a live LLM by hacking GameCube memory"]]></title><description><![CDATA[
<p>> Some games are designed around content and "extraction". Many are not.<p>While I think the parent post leaves a lot of open-ended questions, I think they are spot on about the tightness of design in games.<p>In many open world RPGs, or something like GTA, you cannot open every door in a city.
In Street Fighter you can't take a break to talk to your opponent.
In art games like Journey you cannot deviate from the path.<p>Games are a limited form of entertainment due to technical and resource restrictions, and they always will be. Even something as open-ended and basic as Minecraft has to have limits to its design; you wouldn't want the player to be able to collect every blade of grass off of a block just because you <i>could</i> add that. You have to find the balance between engaging elements and world building.<p>Having an LLM-backed farmer in an RPG who could go into detail on how their crops didn't grow as well last season because it didn't rain as much seems good on paper for world building. But it is just going to devalue the human-curated content around it, as the player has to find what does and does not advance their goals in the limited time they have to play. And if you have some reward for talking to random NPCs, players will just spam the next button until it's over to optimize their fun. All games have to hold back from adding more so that the important parts stand out.</p>
]]></description><pubDate>Wed, 10 Sep 2025 14:13:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=45198009</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=45198009</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45198009</guid></item><item><title><![CDATA[New comment by AkBKukU in "Game preservationists say Switch2 GameKey Cards are disheartening but inevitable"]]></title><description><![CDATA[
<p>> Yes - some things go out of fashion for a while, but trends almost always cycle back.<p>Exactly, and this is even supported by Nintendo's own services offering emulation of their older systems. There is clearly demand for the ability to play older games.<p>Capitulation to an "inevitable" fate of download-only games is just taking the easy way out by not sticking to your own core values. I have personally pre-ordered a Switch 2, but I will not be purchasing any online-only cartridges or download-only software.<p>We haven't had the watershed moment that brings it into focus for gamers at large yet; The Crew was close. But Nintendo has kept the download servers going for all of their systems, which has provided a false sense of security. Once those start being shut down maybe we'll see some actual response. Though with the introduction of GameCube emulation on the Switch 2, they are only a small step away from emulating the Wii and giving people another scapegoat for their lazy acceptance of the lack of ownership.</p>
]]></description><pubDate>Thu, 01 May 2025 14:26:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=43858134</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=43858134</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43858134</guid></item><item><title><![CDATA[New comment by AkBKukU in "An image of an archeologist adventurer who wears a hat and uses a bullwhip"]]></title><description><![CDATA[
<p>This was decided in court[1] over two decades ago: thumbnail images are acceptable fair use and do not constitute a copyright violation.<p>[1] <a href="https://scholar.google.com/scholar_case?case=13767420941977220880&q=kelly+v+arriba+soft+corp&hl=en&as_sdt=6,39" rel="nofollow">https://scholar.google.com/scholar_case?case=137674209419772...</a></p>
]]></description><pubDate>Fri, 04 Apr 2025 01:36:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=43577467</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=43577467</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43577467</guid></item><item><title><![CDATA[New comment by AkBKukU in "Linux Pipes Are Slow"]]></title><description><![CDATA[
<p>It looks like FFmpeg does support reading from sockets natively[1]; I didn't know that. That might be a better solution in this case. I'll have to look into some C code for writing my output to a socket to try that some time.<p>[1] <a href="https://ffmpeg.org/ffmpeg-protocols.html#unix" rel="nofollow">https://ffmpeg.org/ffmpeg-protocols.html#unix</a></p>
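A rough sketch of what that could look like (Python instead of C, purely illustrative; the socket path and function names are made up, and the `unix://` input URL comes from the ffmpeg protocol docs linked above):

```python
import os
import socket

def ffmpeg_socket_cmd(sock_path, width, height, out_path):
    """ffmpeg invocation that reads raw RGBA frames from a unix socket.
    Raw video carries no header, so pixel format and geometry must be
    given explicitly."""
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgba",
        "-s", f"{width}x{height}",
        "-i", f"unix://{sock_path}",
        "-c:v", "libx264", out_path,
    ]

def serve_frames(sock_path, frames):
    """Writer side: bind the socket, wait for ffmpeg to connect, then
    stream each frame (width*height*4 bytes of RGBA) down the socket."""
    if os.path.exists(sock_path):
        os.remove(sock_path)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(sock_path)
    srv.listen(1)
    conn, _ = srv.accept()  # blocks until ffmpeg connects
    try:
        for frame in frames:
            conn.sendall(frame)
    finally:
        conn.close()
        srv.close()
```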
]]></description><pubDate>Mon, 26 Aug 2024 02:17:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=41353424</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=41353424</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41353424</guid></item><item><title><![CDATA[New comment by AkBKukU in "Linux Pipes Are Slow"]]></title><description><![CDATA[
<p>I have a project that uses a proprietary SDK for decoding raw video. I output the decoded data as pure RGBA in a way FFmpeg can read through a pipe to re-encode the video to a standard codec. FFmpeg can't include the non-free SDK in their source, and it would be wildly impractical to store the pure RGBA in a file. So pipes are the only way to do it; there are valid reasons to use high-throughput pipes.</p>
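The pipe side of this setup can be sketched as follows (a minimal Python version; the function names, codec choice, and frame parameters are illustrative, but `-f rawvideo -pix_fmt rgba -i -` is a standard way to feed headerless frames to ffmpeg over stdin):

```python
import subprocess

def encoder_cmd(width, height, fps, out_path):
    """ffmpeg invocation reading raw RGBA frames from stdin ("-i -")
    and re-encoding them to a standard codec."""
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "rgba",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",               # raw frames arrive on the pipe
        "-c:v", "libx264", out_path,
    ]

def start_encoder(width, height, fps, out_path):
    # The caller writes width*height*4 bytes per frame to proc.stdin,
    # then closes it so ffmpeg can finalize the output file.
    return subprocess.Popen(encoder_cmd(width, height, fps, out_path),
                            stdin=subprocess.PIPE)
```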
]]></description><pubDate>Mon, 26 Aug 2024 01:52:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=41353290</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=41353290</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41353290</guid></item><item><title><![CDATA[New comment by AkBKukU in "How CD pregaps gained their hidden track superpowers"]]></title><description><![CDATA[
<p>Look into the Domesday Duplicator project for Laserdiscs as an example of how what ssl-3 is talking about can be done using a high-sample-rate input. That exact process is possible, and with enough storage and processing power it can be used to get the most "low level" access to the data. It is not for the faint of heart though; it can take around 1TB of storage and hours of CPU time to process a full movie this way. I know because I've done it.<p>I believe there is work being done to attempt this on CDs, but it still seemed to be in the exploratory phase and not yet ready for archiving. It might seem like overkill to do this to something meant to be digitally addressed, but I've experienced enough quirks with discs and drives when ripping that I would 100% be willing to switch over to a known-complete capture system to not have to worry about it anymore. Post-process decoding also allows for re-decoding data later if better methods are found.</p>
]]></description><pubDate>Wed, 10 Jul 2024 14:17:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=40927176</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=40927176</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40927176</guid></item><item><title><![CDATA[New comment by AkBKukU in "How CD pregaps gained their hidden track superpowers"]]></title><description><![CDATA[
<p>Even BIN/CUE is not enough. It cannot store subchannel data like CD+G, and it can only hold a single session, which breaks Blue Book CDs that mix audio and data.<p>We do not currently have a widely supported standard for storing a CD image that can properly hold all of a disc's data. Aaru[0] is close, but it still has to output back to other formats like BIN/CUE to use the contents of the disc.<p>[0] <a href="https://www.aaru.app/#/" rel="nofollow">https://www.aaru.app/#/</a></p>
]]></description><pubDate>Wed, 10 Jul 2024 14:07:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=40927059</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=40927059</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40927059</guid></item><item><title><![CDATA[New comment by AkBKukU in "How CD pregaps gained their hidden track superpowers"]]></title><description><![CDATA[
<p>I'm curious if you have a specific example of an album with crowd noise between tracks like that? I collect and rip hundreds of CDs and am always on the lookout for edge-case discs to further hone my tools.<p>On your pregap + 99 indexes remark: the "pregap" is the space between index 00 and 01, and the indexes continue on up to 99. Players seek to index 01 as the start point of the track; there is no separate pregap designation. I've paid special attention to this because it is a difficult problem to solve. Many discs have space between tracks stored in index 00-01, but rarely is there anything audible in there after the first track. The only example I have of this is a specialty music sample disc, Rarefaction's A Poke In The Ear With A Sharp Stick, which has over 500 samples on the disc accessed by track + index positions.<p>As a sidebar based on the later comments in the thread, I've made it a habit to rip and store every audio CD as BIN/CUE+TOC using cdrdao. This allows me to go back and re-process discs I may have missed something on. But even that is imprecise, because it usually breaks Blue Book discs that use multiple sessions to store data, due to absolute LBA addressing. Also, the ways different CD/DVD drives handle reading data between index 00-01 on track 1 are maddening: some will read it, some will error, and the worst ones output fake blank data.</p>
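A cdrdao rip along those lines might look like this (a sketch only; the flags here are from memory, so verify them against `man cdrdao` before relying on them, and the device path and basename are placeholders):

```python
def rip_cmd(device, basename):
    """Build a cdrdao invocation for a raw BIN+TOC rip."""
    return [
        "cdrdao", "read-cd",
        "--read-raw",                # keep full 2352-byte raw sectors
        "--read-subchan", "rw_raw",  # capture subchannel data (CD+G etc.)
        "--device", device,
        "--datafile", f"{basename}.bin",
        f"{basename}.toc",           # cdrdao writes the TOC file here
    ]

# Usage, e.g.: subprocess.run(rip_cmd("/dev/sr0", "disc"), check=True)
```

The companion `toc2cue` tool can then derive a CUE sheet from the TOC, though as noted above multi-session discs may not survive that round trip.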
]]></description><pubDate>Wed, 10 Jul 2024 13:54:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=40926942</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=40926942</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40926942</guid></item><item><title><![CDATA[New comment by AkBKukU in "FCC votes to restore net neutrality rules"]]></title><description><![CDATA[
<p>I became "pro-net neutrality" back in the 2010s when Verizon was trying to charge an extra $20/mo for hotspot functionality on my provider-locked Android phone.<p>After some rooting and sideloading I was gleefully working around that until the FCC came down on them for it[1]. Net neutrality was passed after that and seemed like a logical response as a means of consumer protection.<p>It has always been a user-facing issue; it's just not one that many people seem to want to expend the energy to think through. Netflix isn't using that bandwidth, the users are. Without users, Netflix would use little to no bandwidth, just as it did when it was renting DVDs. The users are paying for their own access and speeds to be able to watch Netflix over the internet instead, and in turn Netflix is paying their ISP to be able to provide that data. Punishing either the users or the web hosts for finding a more effective use case for the internet than just serving static pages is the ISPs either trying to blame someone else for having over-provisioned their networks, or trying to strong-arm web hosts into paying more because they have regional monopolies and can get away with it. As a consumer, if I had a choice between two ISPs and one of them throttled Netflix to try to extort more money from them, even for self-centered reasons I would pick the other just to have better service. But there are a lot of areas where that isn't the case and a single major broadband provider has free rein.<p>[1] <a href="https://www.cnet.com/tech/mobile/what-verizons-fcc-tethering-settlement-means-to-you-faq/" rel="nofollow">https://www.cnet.com/tech/mobile/what-verizons-fcc-tethering...</a></p>
]]></description><pubDate>Fri, 26 Apr 2024 16:48:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=40171535</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=40171535</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40171535</guid></item><item><title><![CDATA[New comment by AkBKukU in "Intel's ambitious Meteor Lake iGPU"]]></title><description><![CDATA[
<p>Not OP, but to provide some historical perspective: RTX hardware raytracing is very firmly a gimmick, and it isn't AI nonsense that's going to be the end of it. It's going to go the way of PhysX, 3D Vision, and EAX audio: cool, but complicated and not worth the effort to game devs, who have to build all the lighting twice to fully implement RT.<p>Nvidia's own site[1] lists a total of 8 fully RT-compatible games, half of which they themselves helped port. There are far more games that "use" it, but only in addition to traditional lighting at the same time, to minimize dev costs. Based on that and past trends, I would personally predict it to be dropped after a generation or two unless they can reuse the RT cores for something else and keep it around as a vestigial feature.<p>[1] <a href="https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/" rel="nofollow">https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-e...</a></p>
]]></description><pubDate>Wed, 10 Apr 2024 14:27:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=39991080</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39991080</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39991080</guid></item><item><title><![CDATA[New comment by AkBKukU in "U.S. sues Apple, accusing it of maintaining an iPhone monopoly"]]></title><description><![CDATA[
<p>Apple doesn't need to support it, they need to not block it and let the user decide if they want to participate in a beta.</p>
]]></description><pubDate>Thu, 21 Mar 2024 17:16:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=39781497</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39781497</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39781497</guid></item><item><title><![CDATA[New comment by AkBKukU in "The Epson HX-20 – A Contrarian's View (2021)"]]></title><description><![CDATA[
<p>I am deeply steeped in the history of computers, and the three biggest things I can point to as the reasons (MS-)DOS won are:<p>- Licensing: Most computers either had custom operating systems that were not shared with other hardware vendors, or, as was frequently the case with BASIC, ran an environment that was itself licensed in.<p>- IBM letting the genie out: The BIOS on the IBM PC 5150 was cloned, quickly and legally, and other companies started making compatibles. This caused an <i>explosion</i> of computer variety in a few short years for a single platform.<p>- Microsoft: DOS usually means "Microsoft DOS", and Microsoft was also responsible for many of the BASIC environments of early systems. The ability to buy your OS from someone else lowered the pressure on hardware makers. IBM also favoured Microsoft's DOS over CP/M-86 and stopped supporting the latter quickly.<p>All this meant the PC-compatible ecosystem with Microsoft DOS became easy to enter from a hardware side, and it lacked a single point of failure like Apple, Radio Shack, Commodore, Atari, etc. There were other MS-DOS-compatible DOSes out there, but MS-DOS was usually the one shipped with computers to be as "IBM compatible" as possible, and it gained dominance through that.<p>EDIT: To those who may not be aware, BASIC did become more OS-like before going away. HP BASIC was extremely feature-packed before HP-UX replaced it and was more capable than MS-DOS in many ways. It evolved far beyond just a programming language.</p>
]]></description><pubDate>Tue, 12 Mar 2024 19:52:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=39684135</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39684135</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39684135</guid></item><item><title><![CDATA[New comment by AkBKukU in "Archival Floppy Disk Preservation and Use [video]"]]></title><description><![CDATA[
<p>(I am the video/page creator)<p>This is the eternal struggle of trying to write about something like this on the modern internet. The video is the flashy thing that gets attention (and revenue which allows me to do this as my job) but the written part is just talking into a void and hoping someone notices. I agree this type of information is best presented in text which is why I made an effort to produce a written component as well. But there's no way that article would have ended up linked somewhere like here.</p>
]]></description><pubDate>Sun, 25 Feb 2024 14:29:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=39501085</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39501085</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39501085</guid></item><item><title><![CDATA[New comment by AkBKukU in "Reverse engineering a forgotten 1970s Intel dual core beast: 8271 (2020)"]]></title><description><![CDATA[
<p>> One would rather expect this sort of functionality implemented in a high level operating system function<p>Almost counterintuitively, floppy drives were actually very fast compared to the CPUs early on. The DMA transfers were more to bypass the CPU than anything. For CHS addressing, some formats would interleave the sectors (e.g. 1,6,2,7,3,8,4,9,5). This purposefully spaces sequential data apart so the CPU has time to process one sector while out-of-sequence sectors pass under the head before the next one in sequence arrives. Putting more load on the CPU compounds this, and that is why dedicated FDC chips never went away.<p>Also, fun fact: the usage of the ISA DMA interface is why you can't have full-featured native floppy controllers on modern motherboards; that interface doesn't exist any more.</p>
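The interleave layout described above can be sketched as a tiny helper (illustrative only; the function name is made up): placing logical sector k at physical slot (factor * (k - 1)) mod sectors reproduces the 1,6,2,7,... pattern.

```python
def interleave_order(sectors, factor):
    """Return the physical slot order for `sectors` logical sectors
    written with the given interleave factor. Logical sector k lands
    at physical slot (factor * (k - 1)) % sectors; this only covers
    every slot when factor and sectors share no common divisor."""
    layout = [0] * sectors
    for k in range(1, sectors + 1):
        layout[(factor * (k - 1)) % sectors] = k
    return layout

print(interleave_order(9, 2))  # → [1, 6, 2, 7, 3, 8, 4, 9, 5]
```

With factor 1 the layout degenerates to plain sequential order, which is why interleave only matters once the CPU can't keep up with back-to-back sectors.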
]]></description><pubDate>Thu, 15 Feb 2024 13:55:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=39382711</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39382711</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39382711</guid></item><item><title><![CDATA[New comment by AkBKukU in "I accidentally Blender VSE"]]></title><description><![CDATA[
<p>One problem is that Blender's multithreaded rendering doesn't scale well to VSE work, because it focuses on breaking up each frame and as a result doesn't utilize multiple cores well. I've experimented with making a plugin[1] that starts multiple render jobs at different points in the timeline in separate processes, and was able to <i>massively</i> speed up renders.<p>I have since switched to Resolve on linux as well, mostly because I use Blackmagic cameras that work better with it.<p>[1] <a href="https://github.com/AkBKukU/blenderSubprocessRender">https://github.com/AkBKukU/blenderSubprocessRender</a></p>
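The core idea can be sketched outside of a plugin with Blender's CLI (a simplified standalone version, not the linked plugin's actual code; `-b`, `-s`, `-e`, `-a` are standard Blender flags, and `-s`/`-e` must precede `-a`):

```python
def segment_cmds(blend_file, total_frames, jobs):
    """Split the timeline into `jobs` contiguous frame ranges and
    build one headless blender invocation per range."""
    step = -(-total_frames // jobs)  # ceiling division
    cmds = []
    for start in range(1, total_frames + 1, step):
        end = min(start + step - 1, total_frames)
        cmds.append(["blender", "-b", blend_file,
                     "-s", str(start), "-e", str(end), "-a"])
    return cmds

# Launch all segments in parallel and wait, e.g.:
#   procs = [subprocess.Popen(c) for c in segment_cmds("edit.blend", 240, 4)]
#   for p in procs: p.wait()
```

Each process renders its own slice of the timeline, so the speedup comes from process-level parallelism rather than Blender's per-frame threading; the rendered segments still need to be concatenated afterwards.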
]]></description><pubDate>Thu, 08 Feb 2024 22:37:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=39308761</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39308761</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39308761</guid></item><item><title><![CDATA[New comment by AkBKukU in "History of Virtual Reality"]]></title><description><![CDATA[
<p>This missed <i>all</i> of the PC VR headsets from the mid-to-late 90s somehow, which was where all of the interesting innovation was happening. PC VR and 3D was a huge market compared to things like the relatively terrible Virtual Boy.<p>If you want to get your fix of probably the most influential era of VR, <a href="http://www.mindflux.com.au/index.html" rel="nofollow">http://www.mindflux.com.au/index.html</a> has been around forever as a time capsule of what was happening on the PC then.</p>
]]></description><pubDate>Mon, 05 Feb 2024 13:11:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=39260871</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39260871</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39260871</guid></item><item><title><![CDATA[New comment by AkBKukU in "Winlator: Android app that lets you to run Windows apps with Wine"]]></title><description><![CDATA[
<p>If a show you like gets canceled for lack of profitability due to piracy, that's a you issue. It is your problem.</p>
]]></description><pubDate>Tue, 23 Jan 2024 13:38:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=39103135</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=39103135</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39103135</guid></item><item><title><![CDATA[New comment by AkBKukU in "YouTube demonetizes public domain 'Steamboat Willie' video after copyright claim"]]></title><description><![CDATA[
<p>> Fairness doesn’t mean that the system should be stacked in favor of anyone who uploads a video.<p>Agreed. However, if a creator repeatedly violates copyright and gets three copyright strikes (which I recognize are distinct from, though related to, claims), their channel is deleted. For the issuer, though, there is no penalty for invalid copyright removal requests. This is the type of unfairness that is an issue. Additionally, the claim issuer needs zero proof that they even have the right to file a claim. The DMCA is mostly unfair to the companies that host the content, forcing them to act against the uploader with zero ability to push back against bad-faith actors. So the system Google has implemented can <i>only</i> legally pass the problem onto the content creators.<p>> Do you have any evidence that a claim negatively impacts search and discovery?<p>No. Can anyone truly have a confident stance that X == Y when it comes to how YouTube presents videos to potential viewers through its black-box "algorithm"? I've seen plenty of inexplicable things happen with video recommendations, as both a creator and a viewer, that make me question what can and can't influence it, and I never make an absolute statement about it, hence the "likely".<p>> that sounds to me like a very pro-creator outcome<p>You glossed over the unfair part there. Having 10 seconds of music in a 10-minute video, because you walked past a restaurant while filming a conversation, can cause drastically disproportionate amounts of revenue to go to the claimant. This part is Google's fault and is an overreaction erring on the side of caution to appease the claim issuers.</p>
]]></description><pubDate>Fri, 05 Jan 2024 14:51:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=38879629</link><dc:creator>AkBKukU</dc:creator><comments>https://news.ycombinator.com/item?id=38879629</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38879629</guid></item></channel></rss>