  • The Worm That No Computer Scientist Can Crack
    www.wired.com
    One of the simplest, most over-studied organisms in the world is the C. elegans nematode. For 13 years, a project called OpenWorm has tried, and utterly failed, to simulate it.
  • The Best Programming Language for the End of the World
    www.wired.com
    Once the grid goes down, an old programming language called Forth, and a new operating system called Collapse OS, may be our only salvation.
  • Reflecting on TikTok's Role in Society as New Ban Deadline Approaches
    www.nytimes.com
    With a national ban unlikely, let's reflect on how the app both sparks joy among users and raises mental health concerns.
  • The secret to using generative AI effectively
    www.computerworld.com
    Do you think generative AI (genAI) sucks? I did. The hype around everything genAI has been over the top and ridiculous for a while now. Especially at the start, most of the tools were flashy, but quickly fell apart if you tried to use them for any serious work purposes.
    When ChatGPT started really growing in early 2023, I turned against it hard. It wasn't just a potentially interesting research product. It was a bad concept getting shoved into everything.
    Corporate layoffs driven by executives who loved the idea of replacing people with unreliable robots hurt a lot of workers. They hurt a lot of businesses, too. With the benefit of hindsight, we can now all agree: genAI, in its original incarnation, just wasn't working.
    At the end of 2023, I wrote about Microsoft's then-new Copilot AI chatbot and summed it up as a storyteller: "a chaotic creative engine that's been pressed into service as a buttoned-up virtual assistant, [with] the seams always showing." You'd probably use it wrong, as I noted at the time. Even if you used it right, it wasn't all that great. It felt like using a smarter autocomplete.
    Much has changed. At this point in 2025, genAI tools can actually be useful, but only if you use them right. And after much experimentation and contemplation, I think I've found the secret.
    Ready to turn up your Windows productivity, with (and without) AI? Sign up for my free Windows Intelligence newsletter. I'll send you free Windows Field Guides as a special welcome bonus!
    The power of your internal dialogue
    So here it is: To get the best possible results from genAI, you must externalize your internal dialogue. Plain and simple, AI models work best when you give them more information and context.
    It's a shift from the way we're accustomed to thinking about these sorts of interactions, but it isn't without precedent. When Google itself first launched, people often wanted to type questions at it, to spell out long, winding sentences. That wasn't how to use the search engine most effectively, though. Google search queries needed to be stripped to the minimum number of words.
    GenAI is exactly the opposite. You need to give the AI as much detail as possible. If you start a new chat and type a single-sentence question, you're not going to get a very deep or interesting response.
    To put it simply: You shouldn't be prompting genAI like it's still 2023. You aren't performing a web search. You aren't asking a question.
    Instead, you need to be thinking out loud. You need to iterate with a bit of back and forth. You need to provide a lot of detail, see what the system tells you, then pick out something that is interesting to you, drill down on that, and keep going.
    You are co-discovering things, in a sense. GenAI is best thought of as a brainstorming partner. Did it miss something? Tell it; maybe you're missing something, and it can surface it for you. The more you do this, the better the responses will get.
    It's actually the easiest thing in the world. But it's also one of the hardest mental shifts to make.
    Let's take a simple example: You're trying to remember a word, and it's on the tip of your tongue. You can't quite remember it, but you can vaguely describe it. If you were using Google to find the word, you'd have to really think about how to craft the perfect search term.
    In that same scenario, you could rely on AI with a somewhat rambling, conversational prompt like this: "What's the word for a soft kind of feeling you get? It's warm, but a little cold. It's sad, but that's not quite right. You miss something, but you're happy you miss it. It's not melancholy, that's wrong, that's too sad. I don't know. It reminds me of walking home from school on a sunny fall afternoon. The sun is setting and you know it will be winter soon, and you miss summer, and you know it's over, but you're happy it happened."
    And the genAI might respond: wistful. That's your answer. More likely, the tool will return a list of possible words. It might not magically know you meant "wistful" right away, but you will know the moment you see the word within its suggestions.
    This is admittedly an overwrought example. A shorter description of the word ("it's kind of like this, and it's kind of like that") would also likely do the trick.
    Ramble on
    The best way to sum up this strategy is simple: You need to ramble.
    Try this, as an experiment: Open up the ChatGPT app on your Android or iOS phone and tap the microphone button at the right side of the chat box. Make sure you're using the microphone button and not the voice chat mode button, which does not let you do this properly.
    (Amusingly enough, the ChatGPT Windows app doesn't support this style of voice input, and Microsoft's Copilot app doesn't, either. This shows that the companies building this type of product don't really understand how it's best used. If you want to ramble with your voice, you'll need to use your phone, or ramble by typing on your keyboard.)
    This is the easiest way to get started with true stream-of-consciousness rambling. (Image credit: Chris Hoffman, Foundry)
    After you tap the microphone button, ramble at your phone in a stream-of-consciousness style. Let's say you want TV show recommendations. Ramble about the shows you like, what you think of them, what parts you like. Ramble about other things you like that might be relevant, or that might not seem relevant! Think out loud. Seriously, talk for a full minute or two. When you're done, tap the microphone button once more. Your rambling will now be text in the box. Your "ums" and speech quirks will be in there, forming extra context about the way you were thinking. Do not bother reading it over; if there are typos, the AI will figure it out. Click send. See what happens.
    Just be prepared for the fact that ChatGPT (or other tools) won't give you a single streamlined answer. It will riff off what you said and give you something to think about. You can then seize on what you think is interesting; when you read the response, you will be drawn to certain things. Drill down, ask questions, share your thoughts. Keep using the voice input if it helps. It's convenient and helps you really get into a stream-of-consciousness rambling state.
    Did the response you got fail to deliver what you needed? Tell it. Say you were disappointed because you were expecting something else. Say you've already watched all those shows and you didn't like them. That is extra context. Keep drilling down.
    You don't have to use voice input, necessarily. But, if you're typing, you need to type like you're talking to yourself with an inner dialogue, stream-of-consciousness style, as if you were speaking out loud. If you say something that isn't quite right, don't hit backspace. Keep going. Say: "That wasn't quite right; I actually meant something more like this other thing."
    The beauty of back-and-forth
    Let's say you want to use genAI to brainstorm the perfect marketing tagline for a campaign. You'd start by rambling about your project, or maybe just speaking a shorter prompt. Ask for a bunch of rough ideas so you can start contemplating, and take it from there.
    But then, critically, you keep going. You say you like a few ideas in particular and want to go more in that direction. You get some more possibilities back. You keep going, on and on: "Well, I like the third one, but I think it needs more of [something], and the sixth one is all right, but [something else]." Keep talking, postulating, refining, following paths of concepts to something that feels more right to you.
    If the tool doesn't seem to be on the right wavelength, don't get frustrated and back out. Tell it: "No, you don't understand. This is for a major clothing company. I need it to sound professional but also catch people's eyes. That's why your suggestions are all too much."
    In a similar way to how the long stream-of-consciousness ramble lays a lot of context to push genAI in a useful direction, this back-and-forth lays a lot of context as groundwork. Your entire conversation until that point forms the scaffolding of the conversation and affects the future responses in the thread. As you keep adding onto and continuing the conversation, you can make it more attuned to what you're looking for.
    Crucially, genAI is not making decisions. You are making all the decisions. You are exercising the taste. You can push it in this or that direction to get ideas. If it lands on something you disagree with, you can push back: "No, that's not right at all. We really got off track. How about...?"
    Is this silly? Well, brainstorming doesn't normally mean sitting in an empty room meditating while staring at paint drying. It often means searching Google, seeing what other people say, poking around for inspiration. This can be similar, but faster.
    Maybe you still use Google for brainstorming sometimes, or go for a walk and be alone with your thoughts! That's fine, too. GenAI is meant to be another tool in your toolbox. It isn't meant to be the end-all answer.
    The bigger AI picture
    To be clear: I'm not here to sell you on the idea of embracing genAI. I'm here to tell you that companies peddling these tools right now are selling you the wrong thing. The way they talk about the technology is not how you should use it. It's no wonder so many smart people are bouncing off it and being rightfully critical of what we're being sold.
    GenAI should not be a replacement for thinking. More than anything, it is a tool for exploring concepts and the connections between them. You can use it to write a better email. You can use it to put together a marketing plan. It will do things you don't expect, and that's the point.
    Yes, it might hallucinate and make things up. (That's why you need to keep your brain engaged.) You might want to just opt out. You might decide to keep plugging away looking for answers. Just remember: If you're using genAI, try to use it to be more human, not less. That will help you write better emails and accomplish much more beyond that.
    Let's stay in touch! Sign up for my free Windows Intelligence newsletter today. I'll send you three new things to try each Friday.
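    For developers who want to see the same "ramble, then drill down" workflow in code rather than in the ChatGPT app, here is a minimal sketch of how a context-accumulating, multi-turn conversation might look through a chat-style API. The column itself only describes using the mobile apps, so treat this as an illustration under assumptions: the OpenAI Python client and the model name are placeholders, and any chat API that accepts a running message history works the same way.

```python
# Minimal sketch of the "ramble, then drill down" pattern over a chat API.
# Assumptions (not from the column): the OpenAI Python client (pip install openai),
# an OPENAI_API_KEY in the environment, and an illustrative model name.
from openai import OpenAI

client = OpenAI()

# The running message list is the whole trick: each turn resends the full
# history, so earlier rambling becomes context for later answers.
messages = [
    {"role": "user", "content": (
        "I'm trying to remember a word. It's a soft kind of feeling, warm but "
        "a little cold, sad but not quite. You miss something but you're happy "
        "you miss it. Not melancholy, that's too sad. It reminds me of walking "
        "home from school on a sunny fall afternoon, knowing winter is coming."
    )},
]

reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)

# Drill down: append the model's answer and your reaction, then ask again.
messages.append({"role": "assistant", "content": reply.choices[0].message.content})
messages.append({"role": "user", "content": (
    "Wistful is close, but I want something with more nostalgia in it. "
    "What else is in that neighborhood?"
)})

reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)
```

    The point of the sketch is the second call: nothing is summarized or restarted, the whole thread rides along, which is exactly the "back-and-forth lays context as groundwork" behavior described above.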
  • Will Microsoft be laid low by the feds' antitrust probe?
    www.computerworld.com
    Microsoft is on top of the world right now, riding its AI dominance to become the world's second-most valuable company, worth somewhere in the vicinity of $3 trillion, depending on the day's stock price.
    But that could easily change, and not because competitors have found a way to topple it as king of AI.
    A federal antitrust investigation threatens to do to the company what was done to it 35 years ago by a US Justice Department suit that tumbled the company from its perch as the world's top tech company. It also led to a lost decade in which Microsoft lagged in the technologies that would transform the world: the internet and the rise of mobile.
    The current investigation was launched last year by the Federal Trade Commission (FTC) under the leadership of Chair Lina Khan. Khan was ousted by President Donald J. Trump when he re-took office in January, and there's been a great deal of speculation about whether his administration would kill the investigation or let it proceed.
    That speculation ended this month, when new FTC Chair Andrew Ferguson asked the company for a boatload of information about its AI operations dating back to 2016, including detailed requests about its training models and how it acquires the data for them.
    The investigation isn't just about AI. It also covers Microsoft's cloud operations, cybersecurity efforts, productivity software, Teams, licensing practices, and more. In other words, just about every important part of the company.
    More details about the investigation
    Although the investigation is a broad one, the most consequential parts focus on the cloud, AI, and the company's productivity suite, Microsoft 365. It will probably dig deep into the way Microsoft uses its licensing practices to push or force businesses to use multiple Microsoft products.
    Here's how The New York Times describes it: "Of particular interest to the FTC is the way Microsoft bundles its cloud computing offerings with office and security products."
    The newspaper claims the investigation is looking at how Microsoft locks customers into using its cloud services by changing the terms under which customers could use products like Office. If the customers wanted to use another cloud provider instead of Microsoft, they had to buy additional software licenses and effectively pay a penalty.
    That's long been a complaint about the way the company does business. European Union regulators last summer charged that Microsoft broke antitrust laws by the way it bundles Teams into its Microsoft 365 productivity suite. Teams rivals like Zoom and Slack don't have the ability to be bundled like that, the EU says, giving Microsoft an unfair advantage. Microsoft began offering some versions of the suite without Teams, but an EU statement about the suit says the EU "preliminarily finds that these changes are insufficient to address its concerns and that more changes to Microsoft's conduct are necessary to restore competition."
    AI is a target, too
    Microsoft's AI business is also in the legal crosshairs, though very few details have come out about it. However, at least part of the probe will likely center on whether Microsoft's close relationship with OpenAI violates antitrust laws by giving the company an unfair market dominance.
    The investigation could also focus on whether Microsoft uses its licensing practices for Microsoft 365 and Copilot, its generative AI chatbot, in ways that violate antitrust laws.
    In a recent column, I wrote that Microsoft now forces customers of the consumer version of Microsoft 365 to pay an additional fee for Copilot even if they don't want it. In January, Microsoft bundled Copilot into the consumer version of Microsoft 365 and raised prices on the suite by $3 per month, or $30 for the year. Consumers are given no choice: if they want Microsoft 365, they'll have to pay for Copilot, whether they use it or not.
    Microsoft also killed two useful features in all versions of Microsoft 365, for consumers as well as businesses, and did it in a way to force businesses to subscribe to Copilot. The features allowed users to do highly targeted searches from within the suite. Microsoft said people could instead use Copilot to do that kind of searching. (In fact, Copilot can't match the features Microsoft killed.) But business and educational Microsoft 365 users don't get Copilot bundled in, so they'll have to pay an additional $30 per user per month if they want the search features, approximately doubling the cost of the Office suite.
    Expect the feds to file suit
    It's almost certain the feds will file at least one suit against Microsoft, by the FTC, the Justice Department, or maybe both. After all, federal lawsuits against Amazon, Apple, Google, and Meta launched by the Biden administration have been continued under Trump. There's no reason to expect he won't target Microsoft as well.
    There's another reason the feds could hit Microsoft hard. Elon Musk is suing OpenAI and Microsoft, claiming their relationship violates antitrust laws. He's also spending billions to compete against them. Given that he's essentially Trump's co-president, as well as being Trump's most important tech advisor, it's pretty much a slam dunk that one more federal suit will be filed.
    As one piece of evidence that suits are coming, the FTC weighed in on Musk's side in his suit against the company and OpenAI, saying antitrust laws support his claims. In a wink-wink, nudge-nudge claim that no one believes, the agency says it's not taking sides in the Musk lawsuit.
    The upshot
    Expect the investigations into Microsoft to culminate in one or more suits filed against the company. After that, it's anyone's guess what might happen. The government could ask that Microsoft be broken into pieces, perhaps lopping off its AI arm. It could even ask that the cloud as well as AI be turned into their own businesses. Or it could go a softer route by fining the company billions of dollars and forcing it to change its business practices.
    Either way, hard times are likely ahead for Microsoft. The big question will be whether CEO Satya Nadella can weather the turbulence better than Bill Gates and Steve Ballmer did when the previous federal suit against the company laid it low for a decade.
  • China built hundreds of AI data centers to catch the AI boom. Now many stand unused.
    www.technologyreview.com
    A year or so ago, Xiao Li was seeing floods of Nvidia chip deals on WeChat. A real estate contractor turned data center project manager, he had pivoted to AI infrastructure in 2023, drawn by the promise of China's AI craze.
    At that time, traders in his circle bragged about securing shipments of high-performing Nvidia GPUs that were subject to US export restrictions. Many were smuggled through overseas channels to Shenzhen. At the height of the demand, a single Nvidia H100 chip, a kind that is essential to training AI models, could sell for up to 200,000 yuan ($28,000) on the black market.
    Now, his WeChat feed and industry group chats tell a different story. Traders are more discreet in their dealings, and prices have come back down to earth. Meanwhile, two data center projects Li is familiar with are struggling to secure further funding from investors who anticipate poor returns, forcing project leads to sell off surplus GPUs. "It seems like everyone is selling, but few are buying," he says.
    Just months ago, a boom in data center construction was at its height, fueled by both government and private investors. However, many newly built facilities are now sitting empty. According to people on the ground who spoke to MIT Technology Review, including contractors, an executive at a GPU server company, and project managers, most of the companies running these data centers are struggling to stay afloat. The local Chinese outlets Jiazi Guangnian and 36Kr report that up to 80% of China's newly built computing resources remain unused.
    Renting out GPUs to companies that need them for training AI models, the main business model for the new wave of data centers, was once seen as a sure bet. But with the rise of DeepSeek and a sudden change in the economics around AI, the industry is faltering.
    The growing pain China's AI industry is going through is largely a result of inexperienced players (corporations and local governments) jumping on the hype train, building facilities that aren't optimal for today's need, says Jimmy Goodrich, senior advisor for technology at the RAND Corporation.
    The upshot is that projects are failing, energy is being wasted, and data centers have become distressed assets whose investors are keen to unload them at below-market rates. The situation may eventually prompt government intervention, he says: "The Chinese government is likely to step in, take over, and hand them off to more capable operators."
    A chaotic building boom
    When ChatGPT exploded onto the scene in late 2022, the response in China was swift. The central government designated AI infrastructure as a national priority, urging local governments to accelerate the development of so-called "smart computing centers," a term coined to describe AI-focused data centers.
    In 2023 and 2024, over 500 new data center projects were announced everywhere from Inner Mongolia to Guangdong, according to KZ Consulting, a market research firm. According to the China Communications Industry Association Data Center Committee, a state-affiliated industry association, at least 150 of the newly built data centers were finished and running by the end of 2024. State-owned enterprises, publicly traded firms, and state-affiliated funds lined up to invest in them, hoping to position themselves as AI front-runners. Local governments heavily promoted them in the hope they'd stimulate the economy and establish their region as a key AI hub.
    However, as these costly construction projects continue, the Chinese frenzy over large language models is losing momentum. In 2024 alone, over 144 companies registered with the Cyberspace Administration of China, the country's central internet regulator, to develop their own LLMs. Yet according to the Economic Observer, a Chinese publication, only about 10% of those companies were still actively investing in large-scale model training by the end of the year.
    China's political system is highly centralized, with local government officials typically moving up the ranks through regional appointments. As a result, many local leaders prioritize short-term economic projects that demonstrate quick results, often to gain favor with higher-ups, rather than long-term development. Large, high-profile infrastructure projects have long been a tool for local officials to boost their political careers.
    The post-pandemic economic downturn only intensified this dynamic. With China's real estate sector, once the backbone of local economies, slumping for the first time in decades, officials scrambled to find alternative growth drivers. In the meantime, the country's once high-flying internet industry was also entering a period of stagnation. In this vacuum, AI infrastructure became the new stimulus of choice.
    "AI felt like a shot of adrenaline," says Li. "A lot of money that used to flow into real estate is now going into AI data centers."
    By 2023, major corporations, many of them with little prior experience in AI, began partnering with local governments to capitalize on the trend. Some saw AI infrastructure as a way to justify business expansion or boost stock prices, says Fang Cunbao, a data center project manager based in Beijing. Among them were companies like Lotus, an MSG manufacturer, and Jinlun Technology, a textile firm: hardly the names one would associate with cutting-edge AI technology.
    This gold-rush approach meant that the push to build AI data centers was largely driven from the top down, often with little regard for actual demand or technical feasibility, say Fang, Li, and multiple on-the-ground sources, who asked to speak anonymously for fear of political repercussions. Many projects were led by executives and investors with limited expertise in AI infrastructure, they say. In the rush to keep up, many were constructed hastily and fell short of industry standards.
    "Putting all these large clusters of chips together is a very difficult exercise, and there are very few companies or individuals who know how to do it at scale," says Goodrich. "This is all really state-of-the-art computer engineering. I'd be surprised if most of these smaller players know how to do it. A lot of the freshly built data centers are quickly strung together and don't offer the stability that a company like DeepSeek would want."
    To make matters worse, project leaders often relied on middlemen and brokers, some of whom exaggerated demand forecasts or manipulated procurement processes to pocket government subsidies, sources say.
    By the end of 2024, the excitement that once surrounded China's data center boom was curdling into disappointment. The reason is simple: GPU rental is no longer a particularly lucrative business.
    The DeepSeek reckoning
    The business model of data centers is in theory straightforward: They make money by renting out GPU clusters to companies that need computing capacity for AI training. In reality, however, securing clients is proving difficult. Only a few top tech companies in China are now drawing heavily on computing power to train their AI models. Many smaller players have been giving up on pretraining their models or otherwise shifting their strategy since the rise of DeepSeek, which broke the internet with R1, its open-source reasoning model that matches the performance of ChatGPT o1 but was built at a fraction of its cost.
    DeepSeek is a moment of reckoning for the Chinese AI industry. "The burning question shifted from 'Who can make the best large language model?' to 'Who can use them better?'" says Hangcheng Cao, an assistant professor of information systems at Emory University.
    The rise of reasoning models like DeepSeek's R1 and OpenAI's ChatGPT o1 and o3 has also changed what businesses want from a data center. With this technology, most of the computing needs come from conducting step-by-step logical deductions in response to users' queries, not from the process of training and creating the model in the first place. This reasoning process often yields better results but takes significantly more time. As a result, hardware with low latency (the time it takes for data to pass from one point on a network to another) is paramount. Data centers need to be located near major tech hubs to minimize transmission delays and ensure access to highly skilled operations and maintenance staff.
    This change means many data centers built in central, western, and rural China, where electricity and land are cheaper, are losing their allure to AI companies. In Zhengzhou, a city in Li's home province of Henan, a newly built data center is even distributing free computing vouchers to local tech firms but still struggles to attract clients.
    Additionally, a lot of the new data centers that have sprung up in recent years were optimized for pretraining workloads (large, sustained computations run on massive data sets) rather than for inference, the process of running trained reasoning models to respond to user inputs in real time. Inference-friendly hardware differs from what's traditionally used for large-scale AI training.
    GPUs like Nvidia H100 and A100 are designed for massive data processing, prioritizing speed and memory capacity. But as AI moves toward real-time reasoning, the industry seeks chips that are more efficient, responsive, and cost-effective. Even a minor miscalculation in infrastructure needs can render a data center suboptimal for the tasks clients require.
    In these circumstances, the GPU rental price has dropped to an all-time low. A recent report from the Chinese media outlet Zhineng Yongxian said that an Nvidia H100 server configured with eight GPUs now rents for 75,000 yuan per month, down from highs of around 180,000. Some data centers would rather leave their facilities sitting empty than run the risk of losing even more money because they are so costly to run, says Fang: "The revenue from having a tiny part of the data center running simply wouldn't cover the electricity and maintenance cost."
    "It's paradoxical: China faces the highest acquisition costs for Nvidia chips, yet GPU leasing prices are extraordinarily low," Li says. "There's an oversupply of computational power, especially in central and west China, but at the same time, there's a shortage of cutting-edge chips."
    However, not all brokers were looking to make money from data centers in the first place. Instead, many were interested in gaming government benefits all along. Some operators exploit the sector for subsidized green electricity, obtaining permits to generate and sell power, according to Fang and some Chinese media reports. Instead of using the energy for AI workloads, they resell it back to the grid at a premium. In other cases, companies acquire land for data center development to qualify for state-backed loans and credits, leaving facilities unused while still benefiting from state funding, according to the local media outlet Jiazi Guangnian.
    Towards the end of 2024, no clear-headed contractor or broker in the market would still go into the business expecting direct profitability, says Fang. "Everyone I met is leveraging the data center deal for something else the government could offer."
    A necessary evil
    Despite the underutilization of data centers, China's central government is still throwing its weight behind a push for AI infrastructure. In early 2025, it convened an AI industry symposium, emphasizing the importance of self-reliance in this technology.
    Major Chinese tech companies are taking note, making investments aligning with this national priority. Alibaba Group announced plans to invest over $50 billion in cloud computing and AI hardware infrastructure over the next three years, while ByteDance plans to invest around $20 billion in GPUs and data centers.
    In the meantime, companies in the US are doing likewise. Major tech firms including OpenAI, SoftBank, and Oracle have teamed up to commit to the Stargate initiative, which plans to invest up to $500 billion over the next four years to build advanced data centers and computing infrastructure. Given the AI competition between the two countries, experts say that China is unlikely to scale back its efforts. "If generative AI is going to be the killer technology, infrastructure is going to be the determinant of success," says Goodrich, the tech policy advisor to RAND.
    "The Chinese central government will likely see [underused data centers] as a necessary evil to develop an important capability, a growing pain of sorts. You have the failed projects and distressed assets, and the state will consolidate and clean it up. They see the end, not the means," Goodrich says.
    Demand remains strong for Nvidia chips, and especially the H20 chip, which was custom-designed for the Chinese market. One industry source, who requested not to be identified under his company policy, confirmed that the H20, a lighter, faster model optimized for AI inference, is currently the most popular Nvidia chip, followed by the H100, which continues to flow steadily into China even though sales are officially restricted by US sanctions. Some of the new demand is driven by companies deploying their own versions of DeepSeek's open-source models.
    For now, many data centers in China sit in limbo, built for a future that has yet to arrive. Whether they will find a second life remains uncertain. For Fang Cunbao, DeepSeek's success has become a moment of reckoning, casting doubt on the assumption that an endless expansion of AI infrastructure guarantees progress.
    "That's just a myth," he now realizes. At the start of this year, Fang decided to quit the data center industry altogether. "The market is too chaotic. The early adopters profited, but now it's just people chasing policy loopholes," he says. He's decided to go into AI education next.
    What stands between now and a future where AI is actually everywhere, he says, is not infrastructure anymore, but solid plans to deploy the technology.
  • The AI Hype Index: DeepSeek mania, Israel's spying tool, and cheating at chess
    www.technologyreview.com
    Separating AI reality from hyped-up fiction isn't always easy. That's why we've created the AI Hype Index: a simple, at-a-glance summary of everything you need to know about the state of the industry.
    While AI models are certainly capable of creating interesting and sometimes entertaining material, their output isn't necessarily useful. Google DeepMind is hoping that its new robotics model could make machines more receptive to verbal commands, paving the way for us to simply speak orders to them aloud. Elsewhere, the Chinese startup Monica has created Manus, which it claims is the very first general AI agent to complete truly useful tasks. And burnt-out coders are allowing AI to take the wheel entirely in a new practice dubbed "vibe coding."
  • A definition of vibe coding, or: how AI is turning everyone into a software developer
    blog.medium.com
    10-minute plays + the magic of taking things apart (Issue #296) · Published in The Medium Blog · Sent as a Newsletter · 3 min read
    In issue #282, we featured a story by product designer Ben Snyder, who used AI to build a rudimentary game in which you (an ostrich) must jump over a barrage of obstacles, and if you don't, you die. Snyder and his kids built the game with Replit and v0, two apps that let you "blink software into reality," as Pete Sena describes it on Medium. You can ask either app to build a game where you have to jump to avoid monsters, and they'll do so instantly.
    The term for this style of on-command software development is "vibe coding." Andrej Karpathy, cofounder of OpenAI, coined it last month, and it instantly caught on. The idea: Instead of developers writing literal lines of code, anyone can direct AI to build based on a prompt and tweak from there. In Karpathy's words: "it's not really coding. I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works." (A small hand-written sketch of this kind of throwaway prototype appears at the end of this item.)
    Vibe coding is a mindset more than a method. It's about giving in to AI's potential, giving in to the vibe of AI-driven development, rather than fighting it.
    Sena views vibe coding as simply the latest development in the Great Democratization Cycle of every technology. We saw this happen to photography (goodbye darkrooms, hello digital photos), publishing (goodbye printing press, hello blogging), even video and music production. Technology always cheapens the means of production, increasing productivity (the amount of photos taken, stories told, code written) and making truly innovative work that much more valuable.
    In the world vibe coding is creating, expertise still matters, but it's a different type of expertise. Now that the gap between ideas and execution has been reduced to basically zero, we'll place even more of a premium on great ideas and elegant execution.
    And, when anyone (even me, a non-engineer) can generate a working prototype in seconds, we'll probably see tech jobs become less specialized. Sena predicts a world where:
    1. Product managers can't hide behind documents and wireframes; they'll need to generate working prototypes
    2. Designers can't simply hand off mockups; they'll need to implement their designs
    3. Marketers can't request custom tools; they'll build their own analytics dashboards
    4. Executives can't claim technical ignorance; they'll need to understand the systems they oversee
    AI-written code is certainly not a panacea, because good software doesn't just work; it's also maintainable. Sena writes that AI can produce code that works initially but falls apart under pressure, and only a good developer knows how to turn an AI's output into something that stands the test of time.
    Still, the bottleneck is no longer development speed; it's knowing which problems are worth solving. (Harris Sockel)
    What else we're reading
    Population growth is decelerating, yet life feels more crowded than ever because, over the last 20 years, (a) people moved from suburbs to cities, and (b) public spaces were replaced by commercial ones. (Cleo Ashbee)
    "In a world that pits diversity, equity and inclusion against merit, I'm here to tell you that my success is due to both." (Joshunda Sanders)
    A brief list of storytelling plots via David K. Farkas, who's written over 100 ten-minute plays. From a highly generalized point of view, he believes, it can be said that human beings tell a limited number of stories over and over again, and each one is some combination of the plots he lists. (Image via David Farkas)
    Your daily dose of practical wisdom
    "The next time something breaks in your home, don't rush to throw it away. Hand it to your child and see where their curiosity takes them. You might be surprised by the magic that unfolds." Oscar Delgadillo, on letting his kids dismantle old clock radios and coffee makers as a way to teach them patience, attention to detail, and how to use their hands (as opposed to screens)
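    Circling back to the vibe-coding item above: for readers who have never seen what one of those instant prototypes looks like, here is a short, hand-written Python sketch in the same spirit as Snyder's "ostrich jumps over obstacles" game. It is purely illustrative; the rules, names, and numbers are assumptions, and it is not the Replit/v0 output described in the newsletter.

```python
# A hand-written stand-in for the kind of quick, throwaway prototype vibe coding
# produces: a turn-based "ostrich vs. obstacles" loop that runs in any terminal.
# Everything here is illustrative, not the game from the newsletter.
import random

def play(rounds: int = 10) -> None:
    score = 0
    for turn in range(1, rounds + 1):
        obstacle = random.random() < 0.5  # roughly half the turns have an obstacle
        prompt = "Obstacle ahead!" if obstacle else "Clear path ahead."
        choice = input(f"Turn {turn}: {prompt} Jump? (y/N): ").strip().lower()
        if obstacle and choice != "y":
            print("You hit the obstacle. Game over.")
            print(f"Final score: {score}")
            return
        score += 1
    print(f"You survived all {rounds} turns. Final score: {score}")

if __name__ == "__main__":
    play()
```

    The interesting part is less the code than how it would be produced: in the vibe-coding workflow, a prompt like "make a terminal game where an ostrich jumps over obstacles" yields something of roughly this shape in one shot, and you then tweak it by asking for changes rather than editing lines yourself.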
  • Games London's Ensemble 2025 cohort announced
    www.gamesindustry.biz
    Annual exhibition features eight underrepresented games professionals, showcasing their work, talent, and stories
    Image credit: London Games Festival | From left to right: Annabel Ashalley-Anthony, Jonas Gawe, John Giwa-Amu, Patrick Haraguti, Bulut Karakaya, Zakia Khan, Tara Mustapha, Sarah York
    News by Sophie McEvoy, Staff Writer. Published on March 26, 2025
    The organisers of London Games Festival have announced this year's cohort for its flagship Ensemble exhibition, championing the work and voices of Black, Asian, and underrepresented games professionals in the UK.
    Curated by author and artistic director Sharna Jackson, the Ensemble initiative showcases the work and stories of eight game developers and creatives from across the industry via exhibitions in various locations during the London Games Festival 2025.
    The first installation will be at LGF's New Game Plus (between April 3 and April 4), then at Trafalgar Square on April 11. LGF is supported by the Mayor of London, and delivered by Games London.
    This year's cohort includes:
    - Annabel Ashalley-Anthony, CEO and founder of Melanin Games, who has also worked for EA, Ubisoft, Square Enix, and ZA/UM Studio, focusing on diversity and inclusion initiatives
    - Jonas Gawe, a community manager and event organiser who has worked with Into Games, Limit Break, and Electric Saint on its new game Crescent County, and led masterclasses at industry events
    - John Giwa-Amu, producer and founder of Good Gate Media, who has worked on projects including The Complex, Five Dates, Ten Dates, Night Book, and Who Pressed Mute on Uncle Marcus
    - Patrick Haraguti, creative director and founder of Ronin Game Studio, with over two decades of experience as a 3D artist and VFX supervisor
    - Bulut Karakaya, lead programmer at Ustwo, who has contributed to projects including Monument Valley 3 and Desta: Between Memories, and has also co-founded two indie studios
    - Zakia Khan, senior character artist and lead artist who has worked for developers including Rare, Lucid Games, NaturalMotion, and Mediatonic Games
    - Tara Mustapha, founder and CEO of Code Coven, who has worked as a game designer on titles including Cartomancy, Detours, and Insecure: The Come Up Game, and partnered with companies including Google, Netflix, and Meta
    - Sarah York, founder of Panda Cat Games, who previously worked at Team17, GREE, Sony Liverpool, and Rocksteady, contributing to games including Batman: Arkham Knight and Killzone: Shadowfall
    "Welcome to the seventh edition of Ensemble, the exhibition that foregrounds the importance of a diverse industry, by demonstrating its significance in the creation of rich and intoxicating worlds and experiences for everyone," said Jackson.
    "Vibrant and essential work is being created by Black, Asian, and minority ethnic talent in the UK's games industry each and every day. Let's celebrate that. Let's highlight these achievements, especially in this climate, to encourage and support the next generation of talent as they emerge."