<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: akra</title><link>https://news.ycombinator.com/user?id=akra</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 17 Apr 2026 11:12:40 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=akra" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by akra in "Everyone in Seattle hates AI"]]></title><description><![CDATA[
<p>Sadly capitalism rewards scarcity at a macro level, which in some ways is the opposite of efficiency. It also grants "social status" to the scarce via more resources. As long as you aren't disrupted, and everyone in your industry does the same/colludes, restricting output and working less usually commands more money up to a certain point (prices are set more like a monopoly in these markets). It's just that in the past scarcity was correlated with difficulty, which made it "somewhat fair" -> AI changes that.<p>It's why unions, associations, professional bodies, etc exist, for example. This whole thread is an example -> the value gained from efficiency in SWE jobs doesn't seem to be accruing to the people with SWE skills.</p>
]]></description><pubDate>Thu, 04 Dec 2025 10:39:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=46146042</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=46146042</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46146042</guid></item><item><title><![CDATA[New comment by akra in "Did Nvidia Just Prove There Is No AI Bubble"]]></title><description><![CDATA[
<p>There is also a chance that a lot of this capex is written off, and the money becomes "sunk". Bad for the current players, but given that inference, as you mention, is profitable, after the writeoffs and the market correction the industry continues on variable inference revenue.<p>The catch is you probably only want to be invested after any writeoffs/corrections if that is your hypothesis. I.e. the future may be AI, but it isn't a straight line, nor is it guaranteed that the current players will be the future AI company of choice. You can be right about the end state and still lose your shirt in between with markets.</p>
]]></description><pubDate>Mon, 01 Dec 2025 02:44:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=46102950</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=46102950</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46102950</guid></item><item><title><![CDATA[New comment by akra in "The Programmer Identity Crisis"]]></title><description><![CDATA[
<p>I think you can enjoy both aspects - the problem solving and the craft. There will be people who agree that, of course, from a rational perspective solving the problem is what matters, but for them personally the "fun" is gone. Generally, people who identify themselves as "programmers", as the article does, would be the people who enjoy problem solving/tinkering/building.</p>
]]></description><pubDate>Tue, 21 Oct 2025 20:25:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=45661222</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=45661222</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45661222</guid></item><item><title><![CDATA[New comment by akra in "Claude Code is all you need"]]></title><description><![CDATA[
<p>In my view it's a tool, at least for the moment. Learn it, work out what works for you and what doesn't. But assuming you are the professional, they should trust your judgement, and you should also earn that trust. That's what you pay skilled people for. If that tool isn't the best for getting the job done, use something else. Of course that professional should be evaluating tools and assuring us/management (whether by evidence or other means) that the most cost efficient and quality product is being built, like in any other profession.<p>I use AI, and for some things it's great. But I'm feeling like they want us to use the "blunt instrument" that is AI when sometimes a smaller, more fine-grained tool/just handcrafting code for accuracy is, at least for me, quicker and more appropriate. The autonomy window, as I recently heard it expressed.</p>
]]></description><pubDate>Tue, 12 Aug 2025 03:31:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=44872087</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=44872087</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44872087</guid></item><item><title><![CDATA[New comment by akra in "A.I. researchers are negotiating $250M pay packages"]]></title><description><![CDATA[
<p>I think the reason for the negativity in this forum (and in other threads I've seen over the past few months) is that people are engaged with AI and, it seems, are deep down not happy with its direction even if they are forced to adapt. That negativity spreads, I think, to the people winning in this, which is common in human nature. At least that's the impression I'm getting here and in other places. The most commented articles on HN these days are about AI (e.g. an OpenAI model, some blogger writing about Claude Code gets 500+ comments, etc), which shows a very high level of emotional engagement, with the typical offensive and defensive attitudes between people who benefit or lose from this. General old school software tech articles are drowned out in comparison; AI is taking all the oxygen out of the room.<p>My anecdotal observation talking to people: most tech cycles I've seen have hype/excitement, but this is the first one, at least that I've been in, where I've seen a large amount of fear/despair. From loss of jobs, to automating all the "good stuff", to enriching only the privileged, etc, people are worried. As loss-averse animals, fear is usually more effective for engagement, especially if it means a loss of what was before - people are engaged but I suspect negative towards the whole AI thing in general, even if they won't say it on the record. Fear also creates a singular focus; when you are threatened/anxious it's harder to engage with other topics, and it makes you see the AI trend as something you would want to see fail. That paints AI researchers not just in a negative light, but almost as people changing their own profession/world for the worse, which doesn't elicit a positive response.<p>And for the others, even if they don't have this engagement, the fact that this is drowning out other things can be annoying to some tech workers as well. Other tech talks, articles, research, etc are just silent in comparison.<p>YMMV; this is just my current anecdotal observation in my limited circle, but I suspect others are seeing the same.</p>
]]></description><pubDate>Mon, 04 Aug 2025 02:31:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=44781616</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=44781616</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44781616</guid></item><item><title><![CDATA[New comment by akra in "6 weeks of Claude Code"]]></title><description><![CDATA[
<p>The question is not whether you can or can't, but whether it is still worth it long term:<p>- Whether there is a moat in doing so (i.e. will people actually pay for your SaaS knowing that they could do it too via AI), and..<p>- How many large scale ideas do you need post AI? Many SaaS products are subscription based and loaded with features you don't need. Most people would prefer a simple product that just does what they need without the ongoing costs.<p>There will be more software. The question is who accrues the economic value of this additional software - the SWE/tech industry (incumbent), the AI industry (disruptor?) and/or the consumer. For SWEs/tech workers it probably isn't what they envisioned when they started in/studied for this industry.</p>
]]></description><pubDate>Sun, 03 Aug 2025 05:02:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=44774185</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=44774185</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44774185</guid></item><item><title><![CDATA[New comment by akra in "What if AI made the world’s economic growth explode?"]]></title><description><![CDATA[
<p>I sadly think that if the promise of AI happens, this is the likely economic outcome. The last century or so was an anomaly compared with most of human history; a trend created by the "arms race" of needing educated workers. The prisoner's dilemma was that if you trained your workers in more efficient tech you could out-compete and take all the profits from competitors, which gave those educated workers the means to strike (i.e. leverage). Now it is the "arms race" of educated AI rather than workers, which could invalidate a lot of assumptions our current society takes for granted in its structure.</p>
]]></description><pubDate>Sun, 27 Jul 2025 06:54:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=44699370</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=44699370</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44699370</guid></item><item><title><![CDATA[New comment by akra in "Intel CEO Letter to Employees"]]></title><description><![CDATA[
<p>That's what AI does. It makes power and politics command even more of a premium vs, say, learning, intelligence and hard work. Connections, wealth and power. It is almost ironic that our industry is inventing the thing that empowers the people that techies often find useless (as per the above comments) while disempowering themselves, often shutting the door behind them.<p>Yes, an AI will come up with more insight than many management people - as many people state in this thread, an LLM can do their job. It's a mistake to assume that's what they are paid for, however.</p>
]]></description><pubDate>Fri, 25 Jul 2025 22:37:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=44689321</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=44689321</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44689321</guid></item><item><title><![CDATA[New comment by akra in "LLM-powered tools amplify developer capabilities rather than replacing them"]]></title><description><![CDATA[
<p>Many tech companies and/or training places did try to though, didn't they? I know they do boot camps, coding classes in schools and a whole bunch of other initiatives to get people into the industry. Teaching kids and adults coding skills has been attempted; the issue is more, IMO, that not everyone has the interest and/or aptitude to continue with it. The problem is that there are parts of the industry/job that aren't actually easy to teach (note: not all of it); it can be quite stressful and requires constantly keeping up - IMO if you don't love it you won't stick with it. As software demand grew, despite the high salaries (particularly in the US) and training, supply didn't keep up with demand until recently.<p>In any case I'm not saying I think they will achieve it, or achieve it soon - I don't have that foresight. I'm just elaborating on their implied goals; they don't state them directly, but reading their announcements on their models, code tools, etc, that's IMO their implied end game. Anthropic recently announced statistics that most of their model usage is for coding. Thinking it is just augmentation doesn't justify the money put into these companies by VCs, funds, etc, IMO - they are looking for bigger payoffs than that, remembering that many of these AI companies aren't breaking even yet.<p>I was replying to the parent comment - augmentation and/or copilots don't seem to be their end game/goal. Whether they are actually successful is another story.</p>
]]></description><pubDate>Wed, 23 Apr 2025 05:07:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=43768783</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43768783</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43768783</guid></item><item><title><![CDATA[New comment by akra in "LLM-powered tools amplify developer capabilities rather than replacing them"]]></title><description><![CDATA[
<p>Agree with most of what you said except for the "big bucks" part. Why would I pay for your product when I can ask the AI to do it? To be honest I think I would rather use that money for anything else if I can spend a little bit of time and get the AI to do it. This is quite deflationary for programming in general, and inflationary for domains not disrupted, all else being equal. There's a point where Jevons' Paradox fails - after all, there's only so much software most normal people want, and at that point tech workers' value relative to other sectors will decline, assuming unequal disruption.<p>The ability to earn the big bucks, as you state it, is not a function of the value delivered/produced, but of the scarcity and difficulty in acquiring said value. That is capitalism. An extreme example is the clean air that we breathe - it is currently free, but extremely valuable to most living things. If we made it scarce (e.g. pollution) eventually people would start charging for it; potentially at extortionary prices depending on how rare it becomes.<p>The only exception I see is if the software encodes a domain that isn't as accessible to people and is kept secret/under wraps, has natural protections (e.g. a government system that is mandatory to use), or is complex and still requires co-ordination and understanding. This does happen, but then I would argue the value is in the adjacent domain knowledge - not in the software itself.</p>
]]></description><pubDate>Tue, 22 Apr 2025 02:43:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=43758627</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43758627</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43758627</guid></item><item><title><![CDATA[New comment by akra in "LLM-powered tools amplify developer capabilities rather than replacing them"]]></title><description><![CDATA[
<p>It's kinda obvious that's their goal, especially with the current focus on coding in most of the AI labs' announcements - it may be augmentation now but that isn't the end game. Everything else these AI labs do, while fun, seems like at most a "meme" to most people in relative terms.<p>Most Copilot-style setups (not just in this domain) are designed to gather data and train/gather feedback before full automation or downsizing. If they outright said it they may not have got the initial usage needed to do so from developers. Even if it is augmentation, it feels like, at least to me, the other IT roles (e.g. BAs, Solution Engineers maybe?) are safer than SWEs going forward. Maybe it's because devs have skin in the game - and without AI it's not that easy of a job over time - which makes it harder for them to see. Respect for SWE as a job in general has fallen, at least in my anecdotal conversations, mainly due to AI - after all, long term career prospects are a major factor in career value, social status and personal goals for most people.<p>Their end goal is to democratize/commoditize programming, with AI as low hanging fruit, which by definition reduces its value per unit of output. The fact that there is so much discussion on this IMO shows that many, even if they don't want to admit it, think there is a decent chance that they will succeed at this goal.</p>
]]></description><pubDate>Mon, 21 Apr 2025 21:04:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=43756461</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43756461</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43756461</guid></item><item><title><![CDATA[New comment by akra in "AI 2027"]]></title><description><![CDATA[
<p>I'm not sure where construction and physical work fit into your categories. Process and chores maybe. But I think AI will struggle in the physical domain - validation is difficult, and repeated experiments to train on are either too risky, too costly or potentially too damaging (i.e. in the real world failure is often not an option, unlike software, where test benches can allow controlled failure in a simulated env).</p>
]]></description><pubDate>Sat, 05 Apr 2025 05:58:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=43591200</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43591200</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43591200</guid></item><item><title><![CDATA[New comment by akra in "AI 2027"]]></title><description><![CDATA[
<p>This is what I think as well. Unfortunately for the AI proponents, they already made an example of the software industry. It's in news reports in the US and globally; most people are no longer recommending getting into the industry, etc. Software, for better or worse, has made an example for other industries as to what "not to do", both w.r.t data (online and open) and culture (e.g. open source, open tests, etc).<p>Anecdotally most people I know are against AI - they see more negatives from it than positives. Reading things like this just reinforces that belief.<p>It raises the questions: why are we even doing this? Why did we invent this? Etc. Most people aren't interested in creating a "worthy successor" that at best eliminates them and potentially their children, seeing that goal as nothing but naive and, dare I say it, wrong. All these thoughts will come from reading the above for most people.</p>
]]></description><pubDate>Sat, 05 Apr 2025 05:50:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=43591163</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43591163</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43591163</guid></item><item><title><![CDATA[New comment by akra in "Qwen2.5-VL-32B: Smarter and Lighter"]]></title><description><![CDATA[
<p>I'm not saying there is a centralised force - I didn't say the government per se. It's enough to say that for many of the models coming out of China, the AI portion isn't the main income source, especially for the major models that people are hyping up (Qwen, DeepSeek, etc). This model (Qwen) from Alibaba is a side model, more likely complementing their main business and cloud offerings. DeepSeek started as a way to use AI for trading models first, then spun this up on the side. I'm more speaking about China's general position - for them AI seems to be more of a complement than the main business, as compared, say, to the major AI labs in America (ex Google).<p>Given, as you say, that the long term cost of AI models is marginally zero, I don't think this is a bad position to be in.</p>
]]></description><pubDate>Wed, 26 Mar 2025 19:53:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=43486411</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43486411</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43486411</guid></item><item><title><![CDATA[New comment by akra in "Qwen2.5-VL-32B: Smarter and Lighter"]]></title><description><![CDATA[
<p>This is the reason IMO. Fundamentally China right now is better at manufacturing (e.g. robotics). AI is the complement to this - AI increases the demand for tech-manufactured goods. America is in the opposite position w.r.t which side is its advantage (i.e. the software). AI for China is an enabler into a potentially bigger market, which is robots/manufacturing/etc.<p>Commoditizing the AI/intelligence part means that the main advantage isn't the bits - it's the atoms. Physical dexterity, social skills and manufacturing skills will gain more of a comparative advantage vs intelligence work in the future as a result - AI makes the old economy new again in the long term. It also lowers the value of AI investments, in that they can no longer command first-mover/monopoly-like pricing for what is a very large capex cost, undermining US investment in what is its advantage. As long as it is strategic, it doesn't necessarily need to be economic on its own.</p>
]]></description><pubDate>Mon, 24 Mar 2025 19:53:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=43464765</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=43464765</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43464765</guid></item><item><title><![CDATA[New comment by akra in "Stargate Project: SoftBank, OpenAI, Oracle, MGX to build data centers"]]></title><description><![CDATA[
<p>Just my opinion/observation really, but I believe it's because people are implicitly entertaining the possibility that it is no longer about software - or rather, this announcement implicitly states that talent long term isn't the main advantage, but instead hardware, compute, etc, and most importantly the wealth and connections to gain access to large sums of capital. AI will enable the capital/wealthy elite to have more of an advantage over human intelligence/ingenuity, which I think is not typically what most hacker/tech forums are about.<p>For example, it isn't what you can do tinkering in your home/garage anymore, or what algorithm you can crack with your intrinsic worth to create more use cases and possibilities - but capital, relationships, hardware and politics. A recent article that went around, and many others, believe capital and wealth will matter more and make "talent" obsolete in the world of AI - the large figure in this announcement just adds money to that hypothesis.<p>All this means the big get bigger. It isn't about startups/grinding hard/working hard/being smarter/etc, which means it isn't really meritocratic. This creates an uneven playing field that is quite different from previous software technology phases, where the gains/access to the gains have been more distributed/democratized and mostly accessible to the talented/hard working (e.g. the risk-taking startup entrepreneur with coding skills and a love of tech).<p>In some ways it is kind of the opposite of the indie hacker stereotype, who ironically is probably one of the biggest losers in the new AI world. In the new world what matters is wealth/ownership of capital, relationships, politics, land, resources and other physical/social assets. In the new AI world scammers, PR people, salespeople, politicians, the ultra wealthy with power, etc thrive, and nepotism/connections are the main advantage. You don't just see this in AI btw (e.g. recent meme coins seen as a better path to wealth than working, due to a weak link to a power figure), but AI like any tech amplifies the capability of people with power, especially if by definition the powerful don't need to be smart or need other smart people to wield it, unlike other tech in the past.<p>They needed smart people in the past; we may be approaching a world where the smart people make themselves as a whole redundant. I can understand why a place like this doesn't want that to succeed, even if the world's resources are being channeled to that end. Time will tell.</p>
]]></description><pubDate>Wed, 22 Jan 2025 20:35:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=42797199</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=42797199</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42797199</guid></item><item><title><![CDATA[New comment by akra in "OpenAI O3 breakthrough high score on ARC-AGI-PUB"]]></title><description><![CDATA[
<p>More likely they will tailor/RL-train these models to go after coders first. Use RLHF, employing coders where labor is cheap, to train their models. A number of reasons for this of course:<p>- Faster product development on their side as they eat their own dogfood<p>- Devs are the biggest market in the transition period for this tech. It gives you some revenue from direct and indirect subscriptions that the general population does not need/require.<p>- Fear among the leftover coders is great for marketing<p>- Tech workers are paid well, which to VCs, CEOs, etc makes it obvious where the value of this tech comes from. Not from new use cases/apps, which would be greatly beneficial to society - but from effectively making people redundant and saving costs. New use cases/new markets are risky; not paying people is something any MBA/accounting type can understand.<p>I've heard some people say "it's like they are targeting SWEs". I say: yes, they probably are. I wouldn't be surprised if it takes SWE jobs while otherwise most people see it as a novelty (it barely affects their lives) for quite some time.</p>
]]></description><pubDate>Sun, 22 Dec 2024 06:09:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=42484684</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=42484684</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42484684</guid></item><item><title><![CDATA[New comment by akra in "OpenAI O3 breakthrough high score on ARC-AGI-PUB"]]></title><description><![CDATA[
<p>It's a common view from the "doers" (the people who made most of the value in the past; the hard workers, etc) that this will make management redundant. Sadly, with a basic understanding of economics you can see this is probably wrong. The "doers" have given more power to the management class at their own expense with this solution - if I can get the AI to "do", all I need are the people who "decide what to do". Market power belongs with scarcity - all else being equal, AI makes the barrier to development smaller, meaning less scarcity on that side. In general, technology developments have increased inequality, especially from the 90's onwards.<p>Generally with AI, I think the top of society stands to gain a lot more than the middle/bottom of it, for a whole host of reasons. If you think anything different, the framework you use to reach your conclusion is probably wrong, at least IMO.<p>I don't like saying this, but there is a reason why the "AI bros", VCs, big tech CEOs, etc are all very very excited about this while many employees (some commenting here) are filled with dread/fear. The sales people, the managers, the MBAs, etc stand to gain a lot from this. Fear also serves as the best marketing tool; it makes people talk and spread OpenAI's news more than everything else. It's a reason why targeting coding jobs/any jobs is so effective. I want to be wrong of course.</p>
]]></description><pubDate>Sun, 22 Dec 2024 04:55:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=42484467</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=42484467</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42484467</guid></item><item><title><![CDATA[New comment by akra in "OpenAI O3 breakthrough high score on ARC-AGI-PUB"]]></title><description><![CDATA[
<p>The default position is that it will decrease the need for what the middle class can offer (skilled labor). All else being equal, that increases the value of the other factors of production (the next bottleneck), such as capital and land.<p>Unless something changes, if I were a billionaire I would be ecstatic at the moment. Now even the impossible seems potentially possible if this delivers on its promises (e.g. go to Mars, build a utopia for my inner circle, etc). I no longer need other people to have everything. Previously there was no point in money if I didn't have a place to spend it/people to accept it. Now, with real assets, I can use AI/machines to do what I want - I no longer need "money", or more accurately other people, to live a very wealthy life.<p>Again, this is all else being equal. Lots of other things could change, but with increasing surveillance through technology I doubt large revolutions/etc will ever get the chance to get off the ground or have the scale to be effective.<p>Interesting times.</p>
]]></description><pubDate>Sun, 22 Dec 2024 04:50:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=42484456</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=42484456</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42484456</guid></item><item><title><![CDATA[New comment by akra in "OpenAI O3 breakthrough high score on ARC-AGI-PUB"]]></title><description><![CDATA[
<p>Money buys real assets, which will be worth something; AI can't magic up land or energy, for instance. In fact AI is a dream for capital, and a nightmare for labor/work/human intelligence w.r.t value.</p>
]]></description><pubDate>Sun, 22 Dec 2024 04:46:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=42484439</link><dc:creator>akra</dc:creator><comments>https://news.ycombinator.com/item?id=42484439</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42484439</guid></item></channel></rss>