ChiefsAholic: A Wolf in Chiefs Clothing Review: A Felonious Superfan on Prime Video
www.wsj.com
A documentary revisits the case of Xaviar Babudar, a Kansas City football obsessive who was well known at games and online, and had another, decidedly more criminal pastime.
-
Why AI language models choke on too much text
www.arstechnica.com
Scaling problems: Compute costs scale with the square of the input size. That's not great.
Timothy B. Lee | Dec 20, 2024 8:00 am
(Image credit: Aurich Lawson | Getty Images)

Large language models represent text using tokens, each of which is a few characters. Short words are represented by a single token (like "the" or "it"), whereas larger words may be represented by several tokens (GPT-4o represents "indivisible" with "ind," "iv," and "isible").

When OpenAI released ChatGPT two years ago, it had a memory, known as a context window, of just 8,192 tokens. That works out to roughly 6,000 words of text. This meant that if you fed it more than about 15 pages of text, it would forget information from the beginning of its context. This limited the size and complexity of tasks ChatGPT could handle.

Today's LLMs are far more capable:

- OpenAI's GPT-4o can handle 128,000 tokens (about 200 pages of text).
- Anthropic's Claude 3.5 Sonnet can accept 200,000 tokens (about 300 pages of text).
- Google's Gemini 1.5 Pro allows 2 million tokens (about 2,000 pages of text).

Still, it's going to take a lot more progress if we want AI systems with human-level cognitive abilities.

Many people envision a future where AI systems are able to do many, perhaps most, of the jobs performed by humans. Yet human workers read and hear hundreds of millions of words during their working years, and absorb even more information from sights, sounds, and smells in the world around them. To achieve human-level intelligence, AI systems will need the capacity to absorb similar quantities of information.

Right now the most popular way to build an LLM-based system to handle large amounts of information is called retrieval-augmented generation (RAG). These systems try to find documents relevant to a user's query and then insert the most relevant documents into an LLM's context window.

This sometimes works better than a conventional search engine, but today's RAG systems leave a lot to be desired. They only produce good results if the system puts the most relevant documents into the LLM's context. But the mechanism used to find those documents, often a search in a vector database, is not very sophisticated. If the user asks a complicated or confusing question, there's a good chance the RAG system will retrieve the wrong documents and the chatbot will return the wrong answer. (A minimal sketch of this retrieval step appears at the end of this section.)

And RAG doesn't enable an LLM to reason in more sophisticated ways over large numbers of documents:

- A lawyer might want an AI system to review and summarize hundreds of thousands of emails.
- An engineer might want an AI system to analyze thousands of hours of camera footage from a factory floor.
- A medical researcher might want an AI system to identify trends in tens of thousands of patient records.

Each of these tasks could easily require more than 2 million tokens of context. Moreover, we're not going to want our AI systems to start with a clean slate after doing one of these jobs. We will want them to gain experience over time, just like human workers do.

Superhuman memory and stamina have long been key selling points for computers. We're not going to want to give them up in the AI age.
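To make that retrieval step concrete, here is a minimal sketch of RAG's core loop: embed each document, embed the query, and stuff the closest matches into the prompt. The embed() function below is a hashing stand-in for a real text-embedding model, and the documents are invented, so treat this as an illustration of the mechanism rather than a working search system.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Stand-in for a real text-embedding model (an assumption for this
    # sketch): it derives a pseudo-random unit vector from the text, so
    # "similarity" here is meaningless, but the plumbing is the same.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

documents = [
    "Q3 revenue grew 12 percent year over year.",
    "The warranty covers parts but not labor.",
    "Employees accrue 1.5 vacation days per month.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    # Cosine similarity between the query vector and every document
    # vector; the top-k documents get inserted into the LLM's context.
    sims = doc_vectors @ embed(query)
    top = np.argsort(sims)[::-1][:k]
    return [documents[i] for i in top]

context = "\n".join(retrieve("How much vacation do I get?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How much vacation do I get?"
```

Everything the article says about RAG's fragility lives in the retrieve() step: if the nearest vectors belong to the wrong documents, the model answers from the wrong context, no matter how good the LLM is.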
Yet today's LLMs are distinctly subhuman in their ability to absorb and understand large quantities of information. It's true, of course, that LLMs absorb superhuman quantities of information at training time. The latest AI models have been trained on trillions of tokens, far more than any human will read or hear. But a lot of valuable information is proprietary, time-sensitive, or otherwise not available for training.

So we're going to want AI models to read and remember far more than 2 million tokens at inference time. And that won't be easy.

The key innovation behind transformer-based LLMs is attention, a mathematical operation that allows a model to think about previous tokens. (Check out our LLM explainer if you want a detailed explanation of how this works.) Before an LLM generates a new token, it performs an attention operation that compares the latest token to every previous token. This means that conventional LLMs get less and less efficient as the context grows.
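As a toy illustration of that per-token comparison, here is one generation step attending over a cache of previous tokens. This is a minimal numpy sketch with invented sizes, not any production implementation:

```python
import numpy as np

def softmax(x):
    x = x - x.max()  # subtract the max for numerical stability
    e = np.exp(x)
    return e / e.sum()

rng = np.random.default_rng(0)
d = 8                                    # head dimension (invented)
cached_keys = rng.normal(size=(5, d))    # one key per previous token
cached_values = rng.normal(size=(5, d))  # one value per previous token

def attend(query, keys, values):
    # The new token's query is scored against EVERY cached key, so the
    # work for each generated token grows with the context length, and
    # the total work over a whole sequence grows quadratically.
    scores = keys @ query / np.sqrt(d)   # one score per previous token
    weights = softmax(scores)            # how much to attend to each token
    return weights @ values              # weighted mix of past information

query = rng.normal(size=d)               # query for the token being generated
context_vector = attend(query, cached_keys, cached_values)
```

Each new token appends a key and a value to the cache and runs this comparison again, which is why every successive token costs a little more to generate than the last.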
Lots of people are working on ways to solve this problem; I'll discuss some of them later in this article. But first I should explain how we ended up with such an unwieldy architecture.

GPUs made deep learning possible

The brains of personal computers are central processing units (CPUs). Traditionally, chipmakers made CPUs faster by increasing the frequency of the clock that acts as the chip's heartbeat. But in the early 2000s, overheating forced chipmakers to mostly abandon this technique.

Chipmakers started making CPUs that could execute more than one instruction at a time. But they were held back by a programming paradigm that requires instructions to mostly be executed in order.

A new architecture was needed to take full advantage of Moore's Law. Enter Nvidia.

In 1999, Nvidia started selling graphics processing units (GPUs) to speed up the rendering of three-dimensional games like Quake III Arena. The job of these PC add-on cards was to rapidly draw the thousands of triangles that made up walls, weapons, monsters, and other objects in a game.

This is not a sequential programming task: triangles in different areas of the screen can be drawn in any order. So rather than having a single processor that executed instructions one at a time, Nvidia's first GPU had a dozen specialized cores, effectively tiny CPUs, that worked in parallel to paint a scene.

Over time, Moore's Law enabled Nvidia to make GPUs with tens, hundreds, and eventually thousands of computing cores. People started to realize that the massive parallel computing power of GPUs could be used for applications unrelated to video games.

In 2012, three University of Toronto computer scientists, Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton, used a pair of Nvidia GTX 580 GPUs to train a neural network for recognizing images. The massive computing power of those GPUs, which had 512 cores each, allowed them to train a network with a then-impressive 60 million parameters. They entered ImageNet, an academic competition to classify images into one of 1,000 categories, and set a new record for accuracy in image recognition.

Before long, researchers were applying similar techniques to a wide variety of domains, including natural language.

Transformers removed a bottleneck for natural language

In the early 2010s, recurrent neural networks (RNNs) were a popular architecture for understanding natural language. RNNs process language one word at a time. After each word, the network updates its hidden state, a list of numbers that reflects its understanding of the sentence so far.

RNNs worked fairly well on short sentences, but they struggled with longer ones, to say nothing of paragraphs or longer passages. When reasoning about a long sentence, an RNN would sometimes forget about an important word early in the sentence. In 2014, computer scientists Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio discovered they could improve the performance of a recurrent neural network by adding an attention mechanism that allowed the network to look back at earlier words in a sentence.

In 2017, Google published "Attention Is All You Need," one of the most important papers in the history of machine learning. Building on the work of Bahdanau and his colleagues, Google researchers dispensed with the RNN and its hidden states. Instead, Google's model used an attention mechanism to scan previous words for relevant context.

This new architecture, which Google called the transformer, proved hugely consequential because it eliminated a serious bottleneck to scaling language models.

Here's an animation illustrating why RNNs didn't scale well:

This hypothetical RNN tries to predict the next word in a sentence, with the prediction shown in the top row of the diagram. This network has three layers, each represented by a rectangle. It is inherently linear: it has to complete its analysis of the first word, "How," before passing the hidden state back to the bottom layer so the network can start to analyze the second word, "are."

This constraint wasn't a big deal when machine learning algorithms ran on CPUs. But when people started leveraging the parallel computing power of GPUs, the linear architecture of RNNs became a serious obstacle.

The transformer removed this bottleneck by allowing the network to think about all the words in its input at the same time:

The transformer-based model shown here does roughly as many computations as the RNN in the previous diagram. So it might not run any faster on a (single-core) CPU. But because the model doesn't need to finish with "How" before starting on "are," "you," or "doing," it can work on all of these words simultaneously. So it can run a lot faster on a GPU with many parallel execution units.

How much faster? The potential speed-up is proportional to the number of input words. My animations depict a four-word input that makes the transformer model about four times faster than the RNN. Real LLMs can have inputs thousands of words long. So, with a sufficiently beefy GPU, transformer-based models can be orders of magnitude faster than otherwise similar RNNs.

In short, the transformer unlocked the full processing power of GPUs and catalyzed rapid increases in the scale of language models. Leading LLMs grew from hundreds of millions of parameters in 2018 to hundreds of billions of parameters by 2020. Classic RNN-based models could not have grown that large because their linear architecture prevented them from being trained efficiently on a GPU.
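The difference is easy to see in code. Below is a deliberately tiny sketch (invented weights and sizes, not a real model) contrasting the RNN's sequential loop with the transformer's all-at-once token comparisons:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
tokens = rng.normal(size=(4, d))  # stand-in embeddings for "How are you doing"

# RNN: inherently sequential. Step t cannot begin until step t-1 has
# produced its hidden state, no matter how many cores sit idle.
W_h = rng.normal(size=(d, d)) * 0.1
W_x = rng.normal(size=(d, d)) * 0.1
h = np.zeros(d)
for x in tokens:
    h = np.tanh(W_h @ h + W_x @ x)  # each step depends on the previous one

# Transformer-style attention scores: every token is compared with every
# other token in a single matrix multiply. There is no step-to-step
# dependency, so the work can be spread across thousands of GPU cores.
scores = tokens @ tokens.T  # a 4x4 grid of all pairwise comparisons
```

The matrix multiply does roughly as much arithmetic as the loop, but none of it has to wait on earlier results, and that independence is exactly what GPUs reward.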
Transformers have a scaling problem

Earlier I said that the recurrent neural network in my animations did roughly the same amount of work as the transformer-based network. But they don't do exactly the same amount of work. Let's look again at the diagram for the transformer-based model:

See all those diagonal arrows between the layers? They represent the operation of the attention mechanism. Before a transformer-based language model generates a new token, it thinks about every previous token to find the ones that are most relevant.

Each of these comparisons is cheap, computationally speaking. For small contexts (10, 100, or even 1,000 tokens) they are not a big deal. But the computational cost of attention grows relentlessly with the number of preceding tokens. The longer the context gets, the more attention operations (and therefore computing power) are needed to generate the next token.

This means that the total computing power required for attention grows quadratically with the total number of tokens. Suppose a 10-token prompt requires 414,720 attention operations. Then:

- Processing a 100-token prompt will require 45.6 million attention operations.
- Processing a 1,000-token prompt will require 4.6 billion attention operations.
- Processing a 10,000-token prompt will require 460 billion attention operations.

This is probably why Google charges twice as much, per token, for Gemini 1.5 Pro once the context gets longer than 128,000 tokens. Generating token number 128,001 requires comparisons with all 128,000 previous tokens, making it significantly more expensive than producing the first or 10th or 100th token.

Making attention more efficient and scalable

A lot of effort has been put into optimizing attention. One line of research has tried to squeeze maximum efficiency out of individual GPUs.

As we saw earlier, a modern GPU contains thousands of execution units. Before a GPU can start doing math, it must move data from slow shared memory (called high-bandwidth memory) to much faster memory inside a particular execution unit (called SRAM). Sometimes GPUs spend more time moving data around than performing calculations.

In a series of papers, Princeton computer scientist Tri Dao and several collaborators have developed FlashAttention, which calculates attention in a way that minimizes the number of these slow memory operations. Work like Dao's has dramatically improved the performance of transformers on modern GPUs.

Another line of research has focused on efficiently scaling attention across multiple GPUs. One widely cited paper describes ring attention, which divides input tokens into blocks and assigns each block to a different GPU. It's called ring attention because GPUs are organized into a conceptual ring, with each GPU passing data to its neighbor.

I once attended a ballroom dancing class where couples stood in a ring around the edge of the room. After each dance, women would stay where they were while men would rotate to the next woman. Over time, every man got a chance to dance with every woman. Ring attention works on the same principle. The "women" are query vectors (describing what each token is "looking for") and the "men" are key vectors (describing the characteristics each token has). As the key vectors rotate through a sequence of GPUs, they get multiplied by every query vector in turn.

In short, ring attention distributes attention calculations across multiple GPUs, making it possible for LLMs to have larger context windows. But it doesn't make individual attention calculations any cheaper.
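That rotation is simple enough to simulate. In the sketch below (pure numpy, with Python lists standing in for GPUs), each "GPU" keeps one block of queries while key blocks rotate around the ring until every query block has been scored against every key block. Real ring attention also rotates value vectors and combines results with an online softmax, which this toy version omits.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, n_gpus = 8, 4, 4            # 8 tokens, dimension 4, 4 pretend GPUs
Q = rng.normal(size=(n, d))       # query vectors ("what am I looking for?")
K = rng.normal(size=(n, d))       # key vectors ("what do I contain?")

full_scores = Q @ K.T             # reference: all scores on one device

q_blocks = np.split(Q, n_gpus)    # each GPU keeps its own query block
k_blocks = np.split(K, n_gpus)    # key blocks rotate around the ring
pieces = [[None] * n_gpus for _ in range(n_gpus)]
for step in range(n_gpus):
    for g in range(n_gpus):
        kb = (g + step) % n_gpus  # key block currently visiting GPU g
        pieces[g][kb] = q_blocks[g] @ k_blocks[kb].T
ring_scores = np.block(pieces)    # stitch the block results back together

assert np.allclose(full_scores, ring_scores)
```

No single "GPU" ever holds more than one key block at a time, which is how the ring lets a context window grow beyond any one device's memory; the final assert confirms that, after a full rotation, the blockwise scores match ordinary attention exactly.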
Could RNNs make a comeback?

The fixed-size hidden state of an RNN means that it doesn't have the same scaling problems as a transformer. An RNN requires about the same amount of computing power to produce its first, hundredth, and millionth token. That's a big advantage over attention-based models.

Although RNNs have fallen out of favor since the invention of the transformer, people have continued trying to develop RNNs suitable for training on modern GPUs.

In April, Google announced a new model called Infini-attention. It's kind of a hybrid between a transformer and an RNN. Infini-attention handles recent tokens like a normal transformer, remembering them and recalling them using an attention mechanism.

However, Infini-attention doesn't try to remember every token in a model's context. Instead, it stores older tokens in a "compressive memory" that works something like the hidden state of an RNN. This data structure can perfectly store and recall a few tokens, but as the number of tokens grows, its recall becomes lossier.

Machine learning YouTuber Yannic Kilcher wasn't too impressed by Google's approach.

"I'm super open to believing that this actually does work and this is the way to go for infinite attention, but I'm very skeptical," Kilcher said. "It uses this compressive memory approach where you just store as you go along, you don't really learn how to store, you just store in a deterministic fashion, which also means you have very little control over what you store and how you store it."

Could Mamba be the future?

Perhaps the most notable effort to resurrect RNNs is Mamba, an architecture that was announced in a December 2023 paper. It was developed by computer scientists Dao (who also did the FlashAttention work I mentioned earlier) and Albert Gu.

Mamba does not use attention. Like other RNNs, it has a hidden state that acts as the model's memory. Because the hidden state has a fixed size, longer prompts do not increase Mamba's per-token cost.

When I started writing this article in March, my goal was to explain Mamba's architecture in some detail. But then in May, the researchers released Mamba-2, which significantly changed the architecture from the original Mamba paper. I'll be frank: I struggled to understand the original Mamba and have not figured out how Mamba-2 works.

But the key thing to understand is that Mamba has the potential to combine transformer-like performance with the efficiency of conventional RNNs.

In June, Dao and Gu co-authored a paper with Nvidia researchers that evaluated a Mamba model with 8 billion parameters. They found that models like Mamba were competitive with comparably sized transformers in a number of tasks, but they "lag behind Transformer models when it comes to in-context learning and recalling information from the context."

Transformers are good at information recall because they remember every token of their context; this is also why they become less efficient as the context grows. In contrast, Mamba tries to compress the context into a fixed-size state, which necessarily means discarding some information from long contexts.

The Nvidia team found they got the best performance from a hybrid architecture that interleaved 24 Mamba layers with four attention layers. This worked better than either a pure transformer model or a pure Mamba model.

A model needs some attention layers so it can remember important details from early in its context. But a few attention layers seem to be sufficient; the rest of the attention layers can be replaced by cheaper Mamba layers with little impact on the model's overall performance.
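A back-of-the-envelope calculation shows why swapping attention layers for Mamba layers saves so much memory: every attention layer has to cache a key vector and a value vector for every token in the context. The sketch below plugs in publicly reported shape figures for Llama 3.1 70B (80 layers, 8 grouped-query KV heads, head dimension 128) with 16-bit values; treat those numbers as assumptions for illustration, not an official accounting.

```python
# KV-cache size for a pure-attention model at long context.
layers = 80            # transformer layers (reported for Llama 3.1 70B)
kv_heads = 8           # grouped-query attention KV heads (reported)
head_dim = 128         # dimension per head (reported)
bytes_per_value = 2    # 16-bit precision

# Keys and values, for every layer, for every token in the context.
per_token = 2 * layers * kv_heads * head_dim * bytes_per_value
context = 256_000
total_gb = per_token * context / 1e9
print(f"{per_token:,} bytes per token -> {total_gb:.0f} GB for {context:,} tokens")
# 327,680 bytes per token -> roughly 84 GB, the same ballpark as the
# 80GB figure AI21 cites below.
```

Replace most of those attention layers with fixed-size Mamba states and the cache shrinks roughly in proportion, which is exactly the effect AI21 reports next.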
In August, an Israeli startup called AI21 announced its Jamba 1.5 family of models. The largest version had 398 billion parameters, making it comparable in size to Meta's Llama 405B model.

Jamba 1.5 Large has seven times more Mamba layers than attention layers. As a result, Jamba 1.5 Large requires far less memory than comparable models from Meta and others. For example, AI21 estimates that Llama 3.1 70B needs 80GB of memory to keep track of 256,000 tokens of context. Jamba 1.5 Large only needs 9GB, allowing the model to run on much less powerful hardware.

The Jamba 1.5 Large model gets an MMLU score of 80, significantly below the Llama 3.1 70B's score of 86. So by this measure, Mamba doesn't blow transformers out of the water. However, this may not be an apples-to-apples comparison. Frontier labs like Meta have invested heavily in training data and post-training infrastructure to squeeze a few more percentage points of performance out of benchmarks like MMLU. It's possible that the same kind of intense optimization could close the gap between Jamba and frontier models.

So while the benefits of longer context windows are obvious, the best strategy for getting there is not. In the short term, AI companies may continue using clever efficiency and scaling hacks (like FlashAttention and Ring Attention) to scale up vanilla LLMs. Longer term, we may see growing interest in Mamba and perhaps other attention-free architectures. Or maybe someone will come up with a totally new architecture that renders transformers obsolete.

But I am pretty confident that scaling up transformer-based frontier models isn't going to be a solution on its own. If we want models that can handle billions of tokens, and many people do, we're going to need to think outside the box.

Tim Lee was on staff at Ars from 2017 to 2021. Last year, he launched a newsletter, Understanding AI, that explores how AI works and how it's changing our world. You can subscribe here.

Timothy B. Lee, Senior tech policy reporter: Timothy is a senior reporter covering tech policy and the future of transportation. He lives in Washington DC.
-
Why Enterprises Still Grapple With Data Governance
www.informationweek.com
Lisa Morgan, Freelance Writer | December 20, 2024 | 9 Min Read
(Image credit: Rancz Andrei via Alamy Stock)

Data governance isn't where it needs to be in many organizations, despite the widespread use of AI and analytics. This is risky on several levels, such as cybersecurity and compliance, not to mention the potential impacts on various stakeholders. In short, data governance is becoming more necessary as organizations rely more heavily on data, not less.

Steve Willis, principal research director, data, analytics, enterprise architecture and AI at Info-Tech Research Group, offers a sobering statistic: some 50% to 75% of data governance initiatives fail.

"Even in highly regulated industries where the acceptance and understanding of the concept and value of governance more broadly are ingrained into the corporate culture, most data governance programs have progressed very little past an expensive [check] boxing exercise, one that has kept regulatory queries to a minimum but returned very little additional business value on the investment," says Willis in an email interview.

Most data professionals cite things like a lack of business understanding and/or executive engagement, limited funding, the complexity of the data landscape, or general organizational resistance to change as the root causes of failed data governance initiatives, though Willis disagrees.

"A lack of a deep connection between the tangible outcomes business stakeholders care about and the activities and initiatives undertaken in the name of data governance is the primary cause of failure," says Willis. "The few who have successfully implemented data governance can easily point to the value that data governance initiatives have delivered. [They are] able to provide a direct line of sight not only to tactical wins but to deep contributions to an organization achieving its strategic goals and objectives."

Where the Problems Lie

Many data teams, particularly data governance teams, lack the proper relationships with business stakeholders, so the business has no visibility into how data governance works.

"Data governance teams should be rigorously focused on understanding how improvements in [data use] will tangibly make life easier for those managing and using data, be it removing critical pain points or creating new opportunities to add value," says Info-Tech's Willis. "By not focusing on their customers' needs, many data governance professionals are over-focused on adding workload to those they are purporting to help in return for providing little measurable value."

(Pictured: Steve Willis, Info-Tech Research Group)

Why the disconnect? Data teams don't feel they can spend time understanding stakeholders or even challenging business stakeholder needs. Though executive support is critical, data governance professionals are not making the most out of that support. One often unacknowledged problem is culture.

"Unfortunately, in many organizations, the predominant attitude towards governance and risk management is that [they are] a burden of bureaucracy that slows innovation," says Willis.
"Data governance teams too frequently perpetuate that mindset, over-rotating on data controls and processes where the effort to execute is misaligned to the value they release."

One way to begin improving the effectiveness of data governance is to reassess the organization's objectives and approach.

"Embed data governance activities, small step by small step, into your current business operations; make managing data part of a business process owner's day-to-day responsibilities rather than making the governance and management of data a separate thing," says Willis. "This abstraction of data governance and management away from business operations is a key reason why nominated data stewards, who are typically business process owners, don't understand what they are being asked to do. As a data governance team, you need to contextualize data management activities into the language the business understands and make it a part of what they do."

Common Mistakes and How to Avoid Them

Businesses are struggling to make data accessible for users and protect it from misuse or breaches. This often results in either too much bureaucracy or insufficient control, leaving organizations vulnerable to inefficiencies and regulatory fines.

The solution is to start small, focus on delivering results, and build from there. "Begin with high-priority areas, like fixing compliance gaps or cleaning up critical datasets, to show quick wins," says Arunkumar Thirunagalingam, senior manager, data and technical operations at healthcare company McKesson, in an email interview. "These early successes help build momentum and demonstrate the value of governance across the organization."

He says the biggest mistakes companies make include trying to fix everything at once, relying too much on technology without setting up proper processes, and ignoring the needs of end users.

"Overly restrictive governance often leads to workarounds that create even more problems, while waiting until a crisis forces action leaves companies in a reactive and vulnerable position," says Thirunagalingam. "[W]hen done right, data governance is much more than a defense mechanism -- it's an enabler of innovation and efficiency."

Stephen Christiansen, principal security consultant at cybersecurity consulting firm Stratascale, says the shortage of data professionals, exploding data growth, and ever-increasing requirements for AI and data security are causing organizations to take a more conservative approach.

"Companies need to be continually investing in data technologies that help them manage, secure, and integrate data across their enterprise systems," says Christiansen in an email interview. "Internally, companies need to [build] a data-driven culture, so employees better understand the importance of data governance and how it benefits them."

David Curtis, chief technology officer at global fintech RobobAI, says the average amount of data is growing 63% monthly. The speed and velocity of this growth is overwhelming, and companies are struggling to manage the storage, protection, quality, and consistency of this data.

"Data is often collected in multiple different ERPs across an organization. This often means that data is disparate in format and incomplete. Eighty percent of companies estimate that 50% to 90% of their data is unstructured," says Curtis in an email interview.
"Unstructured data creates challenges for large organizations due to its lack of standardization, making it difficult to store, analyze, and extract actionable insights, while increasing costs, compliance risks and inefficiencies."

Companies need to start with a data governance strategy. As part of that, they need to review relevant business goals, define data ownership, identify reference data sources, and align the data governance strategy's KPIs. For ongoing success, they need to establish an iterative process of continuous improvement by developing data processes and committing to a master data governance framework.

"For every dollar you invest in AI you should invest five dollars in data quality. In my experience, the most common data challenges are due to a lack of clear objectives and measurable success metrics around master data management initiatives," says Curtis. "Often insufficient or poor-quality data, often at scale, and limited integration with existing systems and workflows, prevents scalability and real-world application."

Evolving regulations are also adding fuel to the fire.

"Organizations are continually challenged with complying with the constant stream of regulations from various jurisdictions, such as GDPR, HIPAA, and CCPA. These regulations keep evolving, and just when IT leaders think they've addressed one set of compliance requirements, a new one emerges with slight nuances, necessitating continuous adjustments to data governance programs," says Kurt Manske, information assurance and cybersecurity leader at professional services firm Cherry Bekaert. "The reality is that companies can't simply pause their operations to align with these ever-changing regulations. Consequently, developing, deploying and managing a data governance program and system is a lot like changing tires on the car as it goes down the highway. [It's] an undeniably daunting task."

This underscores the need to establish a resilient culture versus a reactive one.

"Leading companies see regulatory compliance as a differentiator for their brand and products," says Manske in an email interview. "[One] key reason data governance programs and system deployment projects fail is that organizations try to take on too much at once. Big bang deployment strategies sound impressive, but they often encounter numerous technical and cultural problems when put into practice. Instead, a metered or scaled deployment approach across the enterprise allows the team, vendor and governance leadership to continuously evaluate, correct and improve."

The Sobering Truth

Organizations that lack strong governance are drowning in data, unable to harness its value, and leaving themselves vulnerable to growing cyber threats. According to Klaus Jäck, partner at management consulting firm Horváth USA, incidents like the recent CrowdStrike breach are stark reminders of what's at stake. Data quality issues, silos, unclear ownership and a lack of standardization are just the tip of the iceberg.

(Pictured: Klaus Jäck, Horváth USA)

"The root cause of these struggles is simple: Data is everywhere. Thanks to new sensor technologies, process mining and advanced supervisory systems, data is produced at every step of every business process," says Jäck in an email interview. "The drive to monetize this data has only accelerated its growth. Unfortunately, many organizations are simply not equipped to manage this deluge."

A truly effective strategy must go beyond policies and frameworks; it must include clear metrics to measure how data is used and how much value it creates.
Assigning ownership is also key -- data stewards can help create a control environment sensitive to the nuances of modern data sources, including unstructured data.

"Failing to connect governance to business goals or neglecting executive sponsorship are major mistakes," says Jäck. "Poor communication and training also derail efforts. If employees don't understand governance policies or don't see their value, progress will stall. Similarly, treating governance as a one-time project rather than an ongoing process ensures failure."

Dimitri Sirota, CEO and co-founder at security, privacy, compliance, and AI data management company BigID, says the root causes of data governance challenges often stem from poor data quality and insufficient governance frameworks.

"Inconsistent data collection practices, lack of standardized formats for key data elements such as dates and numeric values, and failure to monitor data quality over time exacerbate the problem," says Sirota in an email interview. "Additionally, organizational silos and outdated systems can perpetuate inconsistencies, as different teams may define or manage data differently. Without a rigorous framework to identify, fix and monitor data issues, organizations face an uphill battle in maintaining reliable, high-quality data."

Ultimately, the absence of a centralized governance strategy makes it difficult to enforce standards, creating noise and clutter in data environments.

Marc Rubbinaccio, head of compliance at security compliance provider Secureframe, points to a related issue: understanding where sensitive data resides and how it flows within organizations.

"[T]he rush to adopt and implement AI within organizations and software products has introduced new risks," says Rubbinaccio in an email interview. "While the efficiency gains from AI are widely recognized, the vulnerabilities it may introduce often go unaddressed due to a lack of thorough risk evaluation. Many organizations are bypassing detailed AI risk assessments in their eagerness to stay ahead, potentially exposing themselves to long-term consequences."

About the Author: Lisa Morgan is a freelance writer who covers business and IT strategy and emerging technology for InformationWeek. She has contributed articles, reports, and other types of content to many technology, business, and mainstream publications and sites including tech pubs, The Washington Post and The Economist Intelligence Unit. Frequent areas of coverage include AI, analytics, cloud, cybersecurity, mobility, software development, and emerging cultural issues affecting the C-suite.
-
Retailers: Learn From the Holidays To Build Year-Round Resilience
www.informationweek.com
Ganesh Seetharaman, Managing Director, Deloitte Consulting | December 20, 2024 | 4 Min Read
(Image credit: Valentin Valkov via Alamy Stock)

During peak times like holiday periods, retailers, consumer goods companies, insurance firms, and others in seasonal crunch-time sectors face a delicate balance between opportunity and risk. Seasonal spikes can be a stringent test for executives, revealing the strength of their business and operational resilience. To understand why, just think back to recent incidents in which organizations experienced mass website outages due to holiday spikes or suffered prolonged log-in issues.

Indeed, downtime during peak periods can result in financial impacts measured in millions of dollars per hour, so it's clear that the user experience is paramount. Even minor issues can lead to significant consequences, including customer churn, wasted ad spending, and long-term brand damage. The takeaway? Failure when the world is watching can have cascading effects, and a track record of 99.99% uptime is insufficient if the 0.01% downtime occurs at critical moments. With that in mind, let's explore a strategic approach to building game-ready resilience.

Game-ready resilience means that your systems can manage adversity -- from ecosystem impacts, including third-party services -- to unprecedented traffic peaks. Most importantly, it also means creating a culture of reliability with constant learning and cross-functional teams that understand the business impacts of downtime and can respond effectively to outages.

To enhance business and operational resilience during the holidays, tech leaders should focus on four key areas.

1. Forecast and define measurable requirements.

Start enhancing resilience by developing a reliable forecast of expected transaction volumes and user behavior. Seek to understand normal traffic patterns as well as how spikes in traffic might affect your systems during peak periods. Prioritize critical services; for example, with an e-commerce platform, the checkout process should take precedence over less-essential features like recommendation engines.

Use service level objectives (SLOs) to define availability expectations and measure them. For instance, aim for 99.99% shopping-cart availability -- which you can foster by forecasting transaction volumes across all channels. (The error-budget arithmetic behind a target like that is sketched after these four steps.) Then, translate those forecasts into performance requirements, like the ability to accommodate a specific number of concurrent users while meeting reliability expectations. It's also crucial to identify potential architectural bottlenecks and failure points.

2. Map dependencies and mitigate risks.

Modern retail ecosystems are complex webs of internal systems and third-party services. To identify vulnerabilities and mitigate risk, create a comprehensive map of all dependencies. Then, assess the services' scalability and reliability, and develop failure contingency plans that include circuit breakers and fallback options.

In addition to infrastructure, focus on key business and foundational services, especially in hybrid and multi-cloud environments. Next, to build agility and minimize recovery time, develop a clear view of all dependency layers and build fault tolerance. An example of dependency management could look like an e-commerce organization simplifying its shipping infrastructure to achieve more efficient package delivery.

3. Implement robust reliability checks.

Establish clear, measurable reliability objectives aligned with business outcomes. For example, you might set granular targets, such as sub-2-millisecond log-in times. Such metrics create a common language across development, operations, and business teams, fostering a unified approach to reliability. Also, to ensure build stability, avoid last-minute changes, and implement rigorous process controls for continuous validation.

Integrate SLOs and synthetic monitoring into your operational framework. Develop real-time observability solutions that provide actionable insights and rapid response capabilities. Implement observability to balance innovation and stability during peak loads, and align technical metrics with indicators like net promoter scores. Also, adopt site reliability engineering to translate technical metrics directly into customer experience.

4. Develop and refine incident-response procedures.

Swift and effective responses to system challenges can prevent minor issues from becoming major crises. So, it's essential to develop incident response procedures that include comprehensive system dependency maps that create communication channels, action plans, and escalation pathways that help minimize confusion. Automatic failure notifications are a must as well, as are self-healing approaches to incidents and solutions driven by error budgets and burn rates.

Next, ensure organizational readiness through training, communication protocols, and regular response drills. Implement proactive monitoring systems to detect and address issues early. Also, learning from high-profile incidents underscores the importance of transparent, timely communication during disruptions.
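As a footnote to the SLO, error-budget, and burn-rate ideas in steps 1, 3, and 4, the underlying arithmetic is simple enough to sketch in a few lines. The numbers below (a 30-day window, ten days elapsed, two minutes of downtime) are illustrative assumptions, not recommendations:

```python
# Error budget for an availability SLO, e.g. the 99.99% shopping-cart target.
slo = 0.9999
window_minutes = 30 * 24 * 60                    # one 30-day SLO window
budget_minutes = (1 - slo) * window_minutes      # downtime the SLO tolerates
print(f"allowed downtime: {budget_minutes:.1f} minutes per 30 days")  # ~4.3

# Burn rate: how fast incidents are consuming that budget. Above 1.0
# means the service is on pace to exhaust the budget before the window
# closes -- a useful early-warning signal during peak season.
downtime_so_far = 2.0                            # minutes of downtime so far
elapsed_fraction = 10 / 30                       # ten days into the window
burn_rate = (downtime_so_far / budget_minutes) / elapsed_fraction
print(f"burn rate: {burn_rate:.2f}x")            # ~1.39x: burning too fast
```

A 99.99% target leaves barely four minutes of slack a month, which is why resilience has to be engineered all year rather than patched in December.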
The Path Forward

Building resilience requires both a cultural and technical shift to align critical services with customer journeys, refine resilience policies, and adapt to changing demands. Practices like game day drills enhance readiness, reinforcing that resilience is an ongoing effort that requires continuous refinement, not a one-time project. True resilience requires a holistic approach that ensures people, processes, and technology work in sync to handle both surges and scale-downs effectively. By adopting the strategies we've discussed here, you can prepare your systems for peak times while building stronger, more resilient year-round operations.

About the Author: Ganesh Seetharaman is a managing director at Deloitte Consulting LLP. He leads Deloitte's Technology Resiliency market offering and is recognized for delivering innovative solutions for his clients, as well as for helping organizations navigate technology challenges and capitalize on market opportunities.
-
Engaging new podcast asks what the big things are that make us human
www.newscientist.com
(Image: Skeleton of Homo naledi, found in the Rising Star cave system in South Africa. Credit: John Hawks/Shutterstock; Australian Museum, University of Sydney, BreakThru Films)

It is after 10pm and I am on a cycleway in Sydney returning from dinner with friends. It is a warm evening in the week before Christmas and people are still out on the streets, gathering for end-of-year drinks.

As I cycle, I'm using my AirPods to listen to a podcast broadcast by Bluetooth from my smartphone. The podcast, downloaded from invisible Wi-Fi, is about the origins of humanity. It strikes me that…
-
Our writers pick the 26 best science fiction short stories of all time
www.newscientist.com
Is your favourite sci-fi short story in this list?

Sometimes you're in the mood for a slice of science fiction but you haven't got the time to embark on a Red Mars or a Dune. All hail, then, the sci-fi short story, bringing you a slice of the weird, the mind-expanding and the futuristic in pocket-sized format.

Did you know that Hugo Gernsback, after whom science fiction's biggest awards, the Hugos, are named, came up with the term "science fiction" (or "scientifiction" as he had it) as he launched the first edition of his sci-fi story magazine Amazing Stories, in 1926? "By 'scientifiction' I mean the Jules Verne, H. G. Wells and Edgar Allan Poe type of story -- a charming romance intermingled with scientific fact and prophetic vision," he wrote. "Not only do these amazing tales make tremendously interesting reading -- they are also always instructive."

Perhaps the stories in the selection below aren't always instructive. They certainly aren't comprehensive. But, chosen by New Scientist staff as their own personal favourites and arranged in order of publication, they are definitely a good read. Enjoy reconnecting with the ones you already know, dive into those you don't, and tell us what we've missed on our Facebook page. We have provided links where these stories have been made available to read online.

The Time Machine by H. G. Wells (1895)

Wells's Time Traveller tells us the story of his visit to the far future (the year Eight Hundred and Two Thousand Seven Hundred and One, A.D.), when the world is in a condition of ruinous splendour, peopled by the Eloi and the Morlocks. What has really stayed with me from the classic sci-fi novella, though, was his journey even further forward in time, to a terrifying future vision. "I cannot convey the sense of abominable desolation that hung over the world. The red eastern sky, the northward blackness, the salt Dead Sea, the stony beach crawling with these foul, slow-stirring monsters, the uniform poisonous-looking green of the lichenous plants, the thin air that hurts one's lungs: all contributed to an appalling effect." So evocative and brilliant; this was published in 1895 (note the plentiful Trump-esque capitals) and is one of the earliest pieces of science fiction, says Britannica. Alison Flood

The Machine Stops by E. M. Forster (1909)

Within the massive apparatus in E. M. Forster's take on the smart home, each individual lives in an underground room that meets all their physical needs and communicates with other humans via a technology akin to video calls. Most characters are perfectly happy to live out their days in isolation, although some insist on travelling through the hostile environment outside in order to meet face to face. Eventually, the first perspective wins out. But when the machine finally breaks down, its cosseted inhabitants face the end of the world.

More than a hundred years after this story was first published, it feels incredibly prescient. In 2020, I sat in my apartment in front of a glowing computer screen, my friends and coworkers reduced to rectangles in a videoconference app, and I felt the walls of the machine surround me. I felt them again last year, when the air was so tainted with wildfire smoke that the horizon turned orange and any New Yorker who was able retreated indoors once again.
As Forster predicted, the machine can be comforting in the face of an unsafe world, and at the same time, it's so stifling that it makes us long for even scraps of the open sky. Sophie Bushwick

Nightfall by Isaac Asimov (1941)

This fun and absorbing early story from Isaac Asimov is almost as if H. P. Lovecraft had ventured into science fiction, creating astronomy-based cosmic horror. It is a searing study of how humans react in the face of the unknown. Imagine a world lit by six stars, having them near enough that you are always bathed in light from at least one of these celestial orbs, making daylight an unassailable constant for more than two thousand years. This luminance is so much relied on that no one has ever needed to invent artificial lights. And then, in a rare astronomical event, the lights go out, and the eclipse lasts not a few minutes, but half a day. Yes, it would be darkness, but not darkness as we know it, which can be scary and full of the unknown. This is darkness for a civilisation that has never seen a night, that has never had to find a candle or torch during a power cut, or traverse a city park after dark, not knowing what threats might be hiding in the shadows. It is a story that compels you to make the intellectual leap to understand what life on another world might really be like, and it is well worth reading for that exercise alone. Chris Simms

The Lottery by Shirley Jackson (1948)

Shirley Jackson is author of one of the scariest novels in the world (The Haunting of Hill House) and one of the most brilliantly unsettling (We Have Always Lived in the Castle). So it is only to be expected that she would also be the author of one of the most quietly disturbing speculative short stories ever written, The Lottery. It takes place in a nondescript rural village, where the locals are gathering for the lottery. It sounds like it's going to be fun. Kids are collecting stones. Everyone knows what is going to happen; they don't think much of neighbouring villages who have got rid of their lotteries ("Listening to the young folks, nothing's good enough for them. Next thing you know, they'll be wanting to go back to living in caves"). But a trickle of unease begins to spread as the lottery draw looms nearer. If you don't know what the twist is, I won't spoil it, but I just read this again and I still feel a little shaky. Jackson is a stone-cold genius. Alison Flood

There Will Come Soft Rains by Ray Bradbury (1950)

There's a reason the smart home is a staple of science fiction (see my other pick, E. M. Forster's The Machine Stops, above). Who wouldn't dream of a house that doesn't merely protect you from the elements, but also caters to your every need? The smart home offers the luxury of having servants, without requiring any pesky interactions with other people. But once you remove the humans who serve from the domestic sphere, you start to wonder what would happen if you also eliminated the ones who are served. That's the scenario that plays out in Ray Bradbury's creepy, beautiful There Will Come Soft Rains. This story tracks the activity of a smart home devoid of its inhabitants. Still, the reader can figure out what must have been the rhythms of their daily lives, their taste in poetry and even the fate that befell them by observing the home's layout, decor and its ongoing automated processes. Without humans in the loop, however, the dwelling is revealed as a sterile, heartless place that destroys the lone living creature that enters and eventually devours itself.
Sophie Bushwick

(Pictured: Ray Bradbury. Credit: Sophie Bassouls/Sygma via Getty Images)

The Pedestrian by Ray Bradbury (1951)

If a dystopian story where cars dominate cities, people spend sedentary evenings gazing at screens and AI-powered police robots fail to grasp human motivations were published today, it might come across as over-egged. But Bradbury's The Pedestrian is 73 years old.

Its protagonist, Leonard Mead, is hauled away to an institution by a driverless police car that can't fathom why he'd be strolling at night with no purpose. The incident is mentioned in Bradbury's later novel Fahrenheit 451, suggesting that they inhabit the same world, and the idea reportedly came to him when he was interrogated by police for walking in Los Angeles in 1949.

Things don't get much more dystopian than reframing a post-dinner stroll as a rebellious act, but the story has valuable messages about the society we have since constructed, one that is increasingly difficult to navigate without technology, and how we maintain humanity in the face of progress. And the unflinching AI that refuses to accept Mead's explanation should give us all pause for thought as we entrench large language models into every aspect of our lives. Matthew Sparkes

The Nine Billion Names of God by Arthur C. Clarke (1953)

This 1953 story from Clarke starts gloriously whimsically: it is the first time, we learn, that anyone's been asked to supply a Tibetan monastery with an Automatic Sequence Computer (they probably all have them these days). The monks want the computer to aid them in their quest to complete a list containing all possible names of God. "What would have taken us fifteen thousand years it will be able to do in a hundred days." The engineers roll their eyes and comply, but what will happen when -- if -- the computer fulfils its task? Short, clever and deliciously unsettling as it ends. Alison Flood

All You Zombies by Robert Heinlein (1958)

Before stories such as Dark, Looper, Back to the Future and Doctor Who, Robert Heinlein delivered one of the most memorable time travel paradoxes ever conceived in his 1958 short story All You Zombies. But don't be fooled by the title: there are no shambling hordes of the walking dead to be found. Instead, the story begins with a bartender serving up shots to a customer while coaxing the latter into sharing their personal circumstances and incredible life story. It is a standard storytelling scene with a twist that is telegraphed in the opening paragraph, because the bartender is actually a temporal agent recruiting the customer to join a shadowy organisation that manipulates the timeline through time travel. Before long, the conversation takes some unexpected but increasingly personal turns for both people. Heinlein supposedly wrote All You Zombies in a single day, and you can read it within half an hour, but don't be surprised if the story slithers into your subconscious and nests in its coils there for years to come. Jeremy Hsu

(Pictured: Cliff Robertson in Charly, a 1968 adaptation of Flowers for Algernon. Credit: Alamy Stock Photo)

Flowers for Algernon by Daniel Keyes (1959)

Every so often, you come across a story that has such a simple yet brilliant idea that you wonder why no one else thought of it before. Flowers for Algernon charts the progress of Charlie Gordon, a man with an IQ of 68, who is given the same surgical treatment as Algernon, a lab mouse that has had its intellect tripled. Charlie's rise in intellect is brilliantly portrayed through the standard of his diary entries.
But alongside his intellectual development come painful and cruel realisations, as Charlie begins to see people around him for what they really are. And then Algernon starts to decline. Will the same happen to Charlie? I read the award-winning novel version of this poignant and moving tale before I found the original short story it was expanded from, which itself won the 1960 Hugo Award for Best Short Story. If anything, the short version is better, subtly taking you through sympathy, pity, outrage and sadness. Like all the best science fiction, although based in science, it is actually about the human condition. It puts a critical lens on how people judge others and makes you question what it means to fit in, and whether intelligence and knowledge are more important than happiness. Chris Simms

2 B R 0 2 B by Kurt Vonnegut (1962)

Vonnegut's story is set in a world where old age has been conquered, and where there are strict population controls. If you want to have a baby, someone has to volunteer to die, by calling the telephone number of the municipal gas chambers of the Federal Bureau of Termination. It's 2 B R 0 2 B. (Try saying it: the 0 is "nought".) We are following the choices of a soon-to-be-father of triplets, as a doctor tells him he needs to line up three deaths if his kids are to survive. "In the year 2000," said Dr. Hitz, "before scientists stepped in and laid down the law, there wasn't even enough drinking water to go around, and nothing to eat but sea-weed -- and still people insisted on their right to reproduce like jackrabbits. And their right, if possible, to live forever." Written in 1962, it still feels very timely these days. Alison Flood

We Can Remember It for You Wholesale by Philip K. Dick (1966)

If you ever daydream of escaping your mundane job and seeing something incredible, you might well empathise with Douglas Quail, who wakes up every morning wanting to see the wonders of another world. It might be an unobtainable dream for a low-earning clerk, but he wants to do what the rich and powerful can do and visit Mars. Why he yearns so strongly for it is a mystery that is slowly unveiled in this rollercoaster 19-page short story that inspired the two Total Recall films, starring Arnold Schwarzenegger and Colin Farrell, respectively. The ideas are the same, but don't expect the same plot. It's an inventive, irreverent ride, delving into wish fulfilment and reality, and scattered with more than a soupçon of humour. There is a rich vein of paranoia running through the tale as you realise that memories, and thus reality, aren't to be trusted. And like the central red pill/blue pill dilemma of The Matrix, it leaves you realising we all have a choice to make: is it better to strive and fight for a dream, to make yourself matter, or to bob along as a salaried employee inside a world that somehow doesn't feel real, but is at least comfortable? Douglas Quail has to make that choice, and so do you. Chris Simms

When It Changed by Joanna Russ (1972)

A contemporary of Ursula K. Le Guin, Joanna Russ was one of the preeminent writers in the second-wave feminism era of science fiction. Her stories explored women's lives with an edge of anger that Russ owned to, proudly, in her conversations with other writers. When It Changed is a perfect, self-contained slice of that anger, laid out against the backdrop of an already-lost utopia. It takes place on a planetary colony called Whileaway, where two women named Janet and Katy live a happy married life.
Thanks to a revolutionary technology that merges two ova into a single embryo, they have three daughters who are descended from them both. Katy is a talented machinist, while Janet alludes to a history of combat and necessary violence. Janet narrates as the pair joins the rest of their community in welcoming visitors from their long-forgotten homeland: men.

It turns out these are the first men seen on Whileaway since a plague killed the colony's entire male population generations earlier. It was a catastrophe the surviving women adapted to, even while mourning the lost. But what will happen as men from Earth, now suffering its own catastrophes, rediscover this planet? There's not much to say of plot: this story spans a single afternoon, just a handful of conversations that slip back and forth across lines of power and feeling. Yet you know, by the end of it, that what you have witnessed is the beginning of a cataclysm. For whom, well, maybe you can guess. Christie Taylor

(Pictured: Ursula K. Le Guin. Credit: Beth Gwinn/Getty Images)

The Ones Who Walk Away From Omelas by Ursula K. Le Guin (1973)

It is the Festival of Summer in Omelas, and everyone is happy. Bells and birds, prancing horses, and everywhere children cavort. Omelas is a city with red roofs and moss-grown gardens. It doesn't matter when in time we are, only that this place should be understood to be singular in the history of humanity. Because everyone, truly, is happy. Our narrator, positioned outside Omelas, speculates: perhaps in Omelas there might be technology the likes of which we could not understand. But definitely not cars, nor war. "As they did without monarchy and slavery, so they also got on without the stock exchange, the advertisement, the secret police, and the bomb," writes Le Guin.

The twist of this story is now famous; I won't tell you. But even before we find the dark centre of this supposed utopia, the narrator is in conversation with you, the reader, as you look for a catch in all this rollicking joy. Surely the problem is that everyone is too happy, naive? Surely pain is the foundation of intellect? "O miracle," the narrator responds: the citizens of Omelas are fully formed, mature and passionate adults. This distrust says more about the reader's failure to imagine.

Speculative fiction writers speak often about our need to dream up better worlds. But you are reminded, with Omelas, to question your imagination even as you nurture it. To find in every utopia someone's dystopia. And to ask about those centred by this story's title: what exactly happens to those who walk away? Christie Taylor

The Screwfly Solution by Raccoona Sheldon (1977)

Another of the feminist second wave, James Tiptree Jr. (writing here under the pen name Raccoona Sheldon) was in conversation with Russ and Le Guin, literally, in fact, as the author corresponded in letters with both. Ten years into Tiptree's writing career, a determined fan discovered that Tiptree was in fact a woman named Alice Sheldon, a former intelligence officer in the second world war and, later, an experimental psychologist. But even as Tiptree, Sheldon posed as a feminist man, whose works often touched on gender, including another story about women learning to get along just fine when men are wiped off the face of the Earth.

The Screwfly Solution is not that story (that story is Houston, Houston, Do You Read?). Instead, it is a series of letters: between a husband, Alan, and his wife, Anne, as Alan conducts research on parasitic flies far from their Michigan home.
Meanwhile, an epidemic of violent misogyny is spreading with a strangely precise pattern. Will scientists discover the cause?

Many things make this story great: the shifting narration, the assorted uselessness of journalism and research papers, the sinking dread as the end of the story approaches like a slow-moving but un-derailable train, even the entomological metaphor of the title's screwflies. But even more so, I think, is how timeless it remains. Even half a century later, the chill of reading it goes deep and lingers long. Christie Taylor

Sandkings by George R. R. Martin (1979)

This slice of sci-fi horror from the author who is still writing The Winds of Winter (come on George!) opens as pet owner Kress goes out looking for a new animal. "I want something exotic. Unusual. And not cute. I detest cute animals. At the moment I own a shambler. Imported from Cotho, at no mean expense. From time to time I feed him a litter of unwanted kittens. That is what I think of cute. Do I make myself understood?" He ends up with a colony of sandkings, small, insectile alien creatures who share a hivemind and are fed by a "maw" that he keeps in his old piranha tank. Needless to say, things don't go to plan in this fun and disturbing tale. Alison Flood

Fire Watch by Connie Willis (1982)

There is a popular what-if scenario of going back in time to assassinate Adolf Hitler before he can start the second world war. Connie Willis's 1982 novelette Fire Watch takes a completely different tack by immediately plunging its time-travelling narrator into confusion as he appears in London during the Nazi German Luftwaffe's bombing raids in 1940. The narrator is tasked with joining fellow volunteers in the seemingly Sisyphean task of putting out incendiary bombs on the roof of St Paul's Cathedral that threaten to burn down the hallowed landmark, even as he struggles with his real assignment of trying to figure out why his history professors have chosen to send him back to that harrowing period without adequate education or preparation. As an added complication, the narrator begins to suspect a fellow fire watch member of subversive wartime activities while he himself struggles to blend in and avoid blowing his cover with the locals. As the narrative follows a series of dated diary entries from the increasingly paranoid and exhausted narrator, Willis's story shines by treating time travel as a tool used judiciously by historians to bear witness and deepen their understanding of humanity, rather than depicting it as a superpower for manipulating the past or future. Jeremy Hsu

Burning Chrome by William Gibson (1982)

From the first line, "It was hot, the night we burned Chrome," this story grabs you and drags you into cyberspace. William Gibson's vision of the future has always been stark. It's not a dreamily futuristic world of clean new technology; it is a perhaps more realistic mishmash of old and new, with hands-on people adapting to change by retrofitting and hacking devices together. Neon lights illuminate hard criminals and doomed love. In this fantastic story, we meet Bobby and Jack, two computer cowboys. Jack goes to buy the digital equivalent of a knife to help give them an advantage when hacking and comes home with a metaphorical neutron bomb. And it could change everything for them.

It's a rollicking ride, and a great introduction to Gibson's Sprawl series, which established cyberpunk as a literary movement. That series kicks off with Neuromancer, still one of my favourite science fiction books ever.
If you read "Burning Chrome" in Gibson's collection of short stories of the same name, you will also find two other Sprawl stories there, both worth reading and both of which have inspired films. In "Johnny Mnemonic", you meet Molly Millions, the chillingly wonderful "razorgirl" or street samurai from Neuromancer, for the first time. She will have you wanting to don mirrorshades. And the other, "New Rose Hotel", is a wonderful, high-tech, low-life tale of corporate espionage. All the Sprawl stories leave you with the nagging feeling that despite technology allowing people to connect so easily, people are still very much lonely – a dystopian outlook that TV shows like Black Mirror have more recently mined to great success. – Chris Simms

"Bloodchild" by Octavia E. Butler (1984)

Octavia Butler is, in my opinion, one of the greatest science fiction writers (see my review of her novel Kindred here), but she didn't write many short stories. Those she did write are excellent: imaginative, thought-provoking and worth seeking out. My favourite is "Bloodchild", which won the Nebula, Hugo and Locus awards and can be found in the book Bloodchild and Other Stories. A colony of humans have left Earth and now live on a planet inhabited by the Tlic. When the Tlic discovered that humans are the perfect host for their eggs, they let them stay on the proviso that each family provides a child to host Tlic eggs. This compelling story follows Gan as he works through his feelings and the reality of imminently becoming a host. There is a mixture of body horror – Butler said she was partly inspired by the life cycle of a botfly – love and tenderness, and I enjoyed the exploration of the idea of male pregnancy in an unexpected way. "Bloodchild" is a thoughtful look at relationships between species, and the pressures placed on young people to do what is in the best interests of their families. I think about it often. – Eleanor Parsons

"Swarm" by Bruce Sterling (1982)

I came across Bruce Sterling's short story "Swarm" after finishing his novel set in the same universe, Schismatrix. The short story appeared at the end of the novel and, craving more of Sterling's kaleidoscopic space society, I dived straight in. After just a few pages, I had this strange feeling of familiarity. A few pages later, it hit me: "Swarm" had been made into an episode of the Netflix show Love, Death and Robots – of which I am a huge fan – and this episode was a particular favourite. The story is set in an alien nest located within an asteroid hurtling through space. The insect-like aliens live in a perfect society where the food is plentiful, the air is warm and everything works as it should. The human characters, Afriel and Mirny, attempt to steal the secrets of this utopia and use them for human purposes. However, their actions lead to the creation of a new insectoid alien designed for intelligence, who is charged with preventing Afriel from exploiting the secrets of the swarm. This story has gore, philosophy, romance and aliens all rolled up into one. Read "Swarm" and then watch the Love, Death and Robots episode, or do it the other way around like me. Both would work. – Finn Grant

Jason Winston George as Afriel and Rosario Dawson as Dr Mirny in a Love, Death & Robots episode adapted from "Swarm" (2022 Netflix, Inc.)

"I Who Have Never Known Men" by Jacqueline Harpman (1995)

The enigmatic dystopian novella I Who Have Never Known Men by Jacqueline Harpman has haunted me since I finished it.
It opens with 39 women and one girl who have been locked in a cage underground for an unknown number of years, closely watched by three guards at all times. None know how they got there. Then, one day, as the guards are delivering food, an alarm goes off and the guards run off in a panic, leaving a hatch unlocked. The women make their escape into... well, I won't spoil it for you. The stark prose and use of repetition would be dull in the wrong hands, but Harpman uses them to great effect in this unsettling meditation on the meaning of life and community, hope and hopelessness, and the effects of captivity. But be warned: if you like your fiction to be tied up in a neat bow, then this isn't one for you. – Eleanor Parsons

"Cloud of Poems" by Cixin Liu (1997)

Better known as Cixin Liu may be for his groundbreaking novels like The Three-Body Problem – the first translated novel to win the Hugo Award for Best Novel – he has also written many rich and rewarding short stories. "Cloud of Poems", which features in his To Hold Up the Sky collection, is probably my favourite of them. In some ways, it feels like a drug-induced trip, as it playfully combines the hard science of a hollowed-out Earth with a debate between an all-powerful god, a measly human and a space-travelling dinosaur about the relative benefits of poetry and technology. Like many other stories by Liu, while being nested in futuristic technology and advanced science, it incites you to consider the relationship between art and technology and how they relate to humanity, all in a tale imbued with the rich cultural history of China. – Chris Simms

"The Man Who Ended History: A Documentary" by Ken Liu (2011)

Many time travel stories explore the implications of manipulating past events to shape the future. Ken Liu chooses to illustrate how the act of merely bearing witness to past events can prove disruptive to governments and societies that selectively engage with history through preferred narratives. Liu's story features an Asian-American couple determined to use an experimental physics breakthrough to help individuals witness the second world war atrocities committed by Unit 731, an Imperial Japanese Army unit that performed deadly experiments on thousands of primarily Chinese civilians and developed biological weapons used on thousands more. The story's documentary-style format swiftly presents a variety of both emotionally charged and apathetic reactions to the controversial proposal, while highlighting how government-backed narratives that flatter national pride often omit inconvenient truths and flatten the complexities of the past. This is not easy reading: various perspectives recount in unsparing, clinical detail how Unit 731's medical personnel committed sexual assault and performed vivisections on living people without anaesthesia. But Liu's story feels incredibly relevant in grappling with thorny questions of how both individual and collective understandings of history continue to shape our present-day world. – Jeremy Hsu

"Welcome to Your Authentic Indian Experience™" by Rebecca Roanhorse (2017)

Rebecca Roanhorse's short story won both the Nebula and Hugo short story awards, and it is easy to see why. I could feel my stomach twisting in knots, combined with a sense of subtle dread, as the Native American protagonist of the story is befriended, abused and then replaced by a "White Wolf".
The parallels with both the modern and historical Native American experience are obvious. Jesse Turnblatt (the protagonist) is a Native American pod jockey who works at a tourist centre offering "Indian" virtual-reality experiences for Tourists. These experiences range from the depraved to the banal. Seemingly uninspired at work, Jesse breaks protocol and befriends one of his customers. What follows is a not-so-subtle critique of the appropriation of Native American culture and, in my eyes at least, of the appropriation of Native American land by white European settlers over the past few centuries. It is written from the second-person perspective, making the funny parts feel funnier and the depressing parts devastating. – Finn Grant

"All Systems Red" by Martha Wells (2017)

This is the novella where we first meet Murderbot, the security cyborg chasing irritably after freedom, self-knowledge and spare time to binge-watch media – not necessarily in that order. I wrote about this series for New Scientist's round-up of our favourite science fiction, and All Systems Red introduces many elements also found in the other books, including technology that melds organic beings with inorganic parts (and vice versa), snarky narration and criticism of corporate power. But this novella is crucial because in it, for the first time, Murderbot makes friends – or, as it would probably put it, gains teammates – who see it as a full person worthy of respect and independence. And then it kills its way across an alien planet to protect them. – Sophie Bushwick

"The Ones Who Stay and Fight" by N. K. Jemisin (2020)

It is the Day of Good Birds in Um-Helat, and everyone is happy. Among the floating skyscrapers and mica-flecked walls, children frolic wearing hand-made wings.

Yes, another utopia, in conversation with Le Guin's, with similar cadence and telescopic view. Jemisin directly acknowledges Omelas, "tick of a city, fat and happy". This is not that.

If Omelas feels flat, a mass of smiling sameness, Um-Helat is a utopia of explicit difference. Special drones help children with mobility impairments enjoy the same play as their peers. You may be unhoused if you like, and sleep under well-swept bridges. If you dwell in delusions, society keeps you safe but still free. We have race, but not racism. "This is not that barbaric America," Jemisin, a Black woman, writes.

Where Le Guin urges us to consider whether joy can be wise, Jemisin holds court on whether human variety can be untroubled by hatred. You, the cynical reader, are brought in to insist that wealth requires poverty; health, illness; beauty, ugliness. Maybe you can't imagine a world without homophobia, or any of the many scarcities we deal in. Jemisin's city offers evidence to the contrary.

And then in this story too comes the pause, the "yes, but". If you have already read Le Guin's work, you are waiting for it. But you will still be surprised. You will be invited to consider, and to feel deeply conflicted. But maybe, you'll stay. – Christie Taylor

"Lena" by qntm (2021)

In my view, the perfect sci-fi short story must have one idea, done extremely well, while also hinting at the larger implications of that idea for a wider world. "Lena" by qntm does just that, telling the story of the first copy of a human brain uploaded to a computer, and its consequences, in under 2000 words. Written in the form of a Wikipedia article, it describes how the digital brain has been repeatedly copied and put to work – and the horrifying lessons researchers have learned.
While "Lena" was written in 2021, just before the current AI boom, the methods needed to cajole the brain into working are strangely reminiscent of the prompts used to manipulate large language models like the one behind ChatGPT, though euphemisms like "red motivation" conceal a much darker reality. Even the story's title is masterfully chosen: it is named for a picture of Swedish model Lena Forsén, published in Playboy magazine in the 1970s and since widely reproduced by computer science researchers as a test image, perhaps becoming one of the most duplicated images in history. – Jacob Aron
-
The Download: shaking up neural networks, and the rise of weight-loss drugs
www.technologyreview.com

This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.

The next generation of neural networks could live in hardware

Networks programmed directly into computer chip hardware can identify images faster, and use much less energy, than the traditional neural networks that underpin most modern AI systems. That's according to work presented at a leading machine learning conference in Vancouver last week.

Neural networks, from GPT-4 to Stable Diffusion, are built by wiring together perceptrons, which are highly simplified simulations of the neurons in our brains. In very large numbers, perceptrons are powerful, but they also consume enormous volumes of energy. Part of the trouble is that perceptrons are just software abstractions: running a perceptron network on a GPU requires translating that network into the language of hardware, which takes time and energy. Building a network directly from hardware components does away with a lot of those costs. And one day, they could even be built directly into chips used in smartphones and other devices. Read the full story.

Grace Huckins
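To make the "software abstraction" point concrete, here is a minimal perceptron sketch in Python. This illustrates the simplified software neuron described above, not the hardware networks from the new research; the hand-picked weights and the AND-gate example are invented purely for demonstration.

```python
import numpy as np

def perceptron(inputs, weights, bias):
    # A single perceptron: a weighted sum of inputs passed through a step function.
    return 1 if np.dot(weights, inputs) + bias > 0 else 0

# Toy example: weights chosen by hand so the perceptron behaves as a logical AND gate.
weights = np.array([1.0, 1.0])
bias = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(np.array(x), weights, bias))
```

Each of these tiny units is cheap on its own; the energy cost comes from simulating billions of them in software on general-purpose hardware, which is the overhead the chip-level approach aims to remove.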
Drugs like Ozempic now make up 5% of prescriptions in the US

What's new? US doctors write billions of prescriptions each year. During 2024, though, one type of drug stood out: the wonder drugs known as GLP-1 agonists. As of September, one of every 20 prescriptions written for adults was for one of these drugs, according to the health data company Truveta.

The big picture: According to the data, people who get prescriptions for these drugs are younger, whiter, and more likely to be female. In fact, women are twice as likely as men to get a prescription. Yet not everyone who's prescribed the drugs ends up taking them: half the new prescriptions for obesity are going unfilled. Read the full story.

Antonio Regalado

Why childhood vaccines are a public health success story

Childhood vaccination is a success story. In the 50 years since the World Health Organization launched its ambitious global childhood vaccination program, vaccines are estimated to have averted 154 million deaths. That number includes 146 million children under the age of five.

But concerns around vaccines endure – especially, it seems, among the individuals Donald Trump has picked to lead US health agencies from January. So let's take a look at their claims, and where the evidence really stands on childhood vaccines. Read the full story.

Jessica Hamzelou

This story is from The Checkup, our weekly health and biotech newsletter. Sign up to receive it in your inbox every Thursday.

The must-reads

I've combed the internet to find you today's most fun/important/scary/fascinating stories about technology.

1 Elon Musk is the shadow president of the United States
The billionaire pressured Republicans into impeding a spending bill, despite lacking an official government role. (WP $)
+ He posted about the bill more than 100 times on Wednesday alone (NBC News)
+ ...but those posts were generally misleading or outright false. (Rolling Stone $)
+ Lawmakers aren't thrilled about Musk's interference. (NYT $)

2 Amazon workers are striking during the Christmas rush
The walkouts could delay the delivery of parcels across the US. (WSJ $)
+ Amazon is refusing to recognize the workers' labor union. (WP $)

3 The US is growing increasingly wary of Nvidia's overseas sales spree
Officials worry the chipmaker's deals could end up empowering its adversaries. (NYT $)
+ US-based venture firms have pledged to avoid taking funding from China. (WP $)
+ Custom chipmaker Broadcom's stock is surging right now. (Insider $)

4 Dozens of families are suing Snap over teen overdoses
They allege Snapchat helped dealers to sell deadly counterfeit drugs to their children. (Bloomberg $)

5 Ukraine's drone footage will be used to train AI models
The country has collected 228 years' worth of data during its conflict with Russia. (Reuters)
+ An overnight drone attack set fire to a refinery in south Russia. (Bloomberg $)
+ Meet the radio-obsessed civilian shaping Ukraine's drone defense. (MIT Technology Review)

6 Jailbreaking AI models can be as simple as TyPiNg LiKe ThIs
And the methods are simple to automate, too – see the toy sketch at the end of this edition. (404 Media)
+ Text-to-image AI models can be tricked into generating disturbing images. (MIT Technology Review)

7 India's answer to Silicon Valley is under immense pressure
Bengaluru's rapid expansion is pushing the city's infrastructure to the absolute limit. (Insider $)
+ India's gig economy is focusing on 10-minute deliveries. (Bloomberg $)
+ How Indian health-care workers use WhatsApp to save pregnant women. (MIT Technology Review)

8 What's next for AI gadgets?
Consumers weren't overly enamored with them in 2024. (Fast Company $)

9 The man who claimed to have created bitcoin has been sentenced
Craig Wright has been given a one-year suspended sentence after refusing to stop suing developers. (The Guardian)
+ He'll face jail if he continues claiming he really is the mysterious Satoshi Nakamoto. (BBC)

10 Online returns aren't what they used to be
Retailers are fed up, and so are customers. (The Atlantic $)

Quote of the day

"You guys scared the life out of a lot of people."

Geno, an Arizona resident, tells Amazon workers that their delivery drones are making his neighbors uneasy amid the drone panic gripping the US, the New York Times reports.

The big story

Bright LEDs could spell the end of dark skies

August 2022

Scientists have known for years that light pollution is growing and can harm both humans and wildlife. In people, increased exposure to light at night disrupts sleep cycles and has been linked to cancer and cardiovascular disease, while wildlife suffers from interruptions to its reproductive patterns and increased danger.

Astronomers, policymakers, and lighting professionals are all working to find ways to reduce light pollution. Many of them advocate installing light-emitting diodes, or LEDs, in outdoor fixtures such as city streetlights, mainly for their ability to direct light to a targeted area. But the high initial investment and durability of modern LEDs mean cities need to get the transition right the first time or potentially face decades of consequences. Read the full story.

Shel Evergreen

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)

+ How the Black diaspora celebrates Christmas across the world, featuring Motown tunes and a tasty saltfish salad.
+ We love you, Pamela Anderson!
+ Test your science knowledge with this fiendish quiz of the year.
+ Let's look ahead to just some of the exciting films coming out next year, from Bridget Jones to the bonkers-sounding Mickey 17.
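A footnote on item 6 above: the 404 Media piece describes jailbreaks built on trivial text mangling, such as alternating capitalization. Purely as a toy illustration of how easily that perturbation can be automated – the function name and example prompt below are invented here, not code from the article – a few lines of Python suffice:

```python
def alternating_caps(prompt: str) -> str:
    # Alternate upper/lower case across the letters of a prompt,
    # leaving spaces and punctuation untouched.
    out, upper = [], True
    for ch in prompt:
        if ch.isalpha():
            out.append(ch.upper() if upper else ch.lower())
            upper = not upper
        else:
            out.append(ch)
    return "".join(out)

print(alternating_caps("typing like this"))  # -> TyPiNg LiKe ThIs
```

The point is not this particular transformation but how cheap it is: trivial string mangling like this can be generated and retried at scale, which is what makes such jailbreak attempts hard to filter.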
-
Elon Musk downplayed his influence after Democrats started calling him 'President Musk'
www.businessinsider.com

Elon Musk tried to play down his role in tanking a government spending bill on Thursday.
Democrats have started calling him "President Musk," in a move likely to frustrate Donald Trump.
"Trump must absolutely hate the whole President Musk thing," one commentator said.

Elon Musk has tried to downplay his influence in helping tank a government funding bill after Democrats started referring to him as "President Musk."

In a series of X posts on Thursday night, Musk tried to distance himself from Democrats' claims that he is now the de facto leader of the Republican Party.

Musk, who will co-lead the Department of Government Efficiency under President-elect Donald Trump, had criticized the first version of the spending bill earlier this week, calling for it to be "killed." A revised spending bill that he helped usher in then failed to get enough votes, potentially setting the stage for a government shutdown.

"Objectively, the vast majority of Republican House members voted for the spending bill, but only 2 Democrats did," Musk wrote in response. "Therefore, if the government shuts down, it is obviously the fault of @RepJeffries and the Democratic Party."

Before the vote, Musk had posted: "First of all, I'm not the author of this proposal. Credit to @realDonaldTrump, @JDVance & @SpeakerJohnson."

All but 38 House Republicans voted for the revised bill, but it fell short of the two-thirds majority required to extend government funding until March.

Democrats seized on the opportunity to embarrass Trump by portraying him as a subordinate of Musk. Rep. Brendan Boyle of Pennsylvania said, "The leader of the GOP is Elon Musk," adding, "He's now calling the shots." Rep. Greg Casar of Texas asked if Musk was "kind of cosplaying co-President here," adding, "I don't know why Trump doesn't just hand him the Oval Office."

Meanwhile, Rep. Rosa DeLauro of Connecticut, the top Democratic member on the House Appropriations Committee, said Republicans "got scared" because "President Musk said: 'Don't do it – shut the government down.'"

Others also weighed in. "Welcome to the Elon Musk presidency," Rep. Robert Garcia of California said in a post on Thursday. "It's clear who's in charge, and it's not President-elect Donald Trump," Rep. Pramila Jayapal of Washington added.

After Thursday's vote, Musk reacted favorably to a post that said the reason Democrats keep saying "President" Elon Musk was to "drive a wedge" between him and Trump.

Charlie Sykes, a political commentator and author of "How the Right Lost Its Mind," wrote that Musk had committed two cardinal sins: "upstaging" Trump and being responsible for an "embarrassing defeat."

"Trump must absolutely hate the whole President Musk thing," he added.
-
Paramount greenlit another 'Sonic' movie as the latest is pacing to beat Disney's surprisingly weak 'Lion King' prequel
www.businessinsider.com

Paramount announced "Sonic the Hedgehog 4" just as the third movie is about to be released.
"Sonic the Hedgehog 3" is projected to have a bigger domestic opening than "Mufasa: The Lion King."
2019's "The Lion King" remake made $1.6 billion worldwide, but the prequel has far lower expectations.

Paramount announced that a new Sonic movie is already in development ahead of the opening weekend for "Sonic the Hedgehog 3." The announcement came as revenue projections for "Sonic 3" came in surprisingly strong, ahead of Disney's anticipated "Mufasa: The Lion King."

The franchise is based on the beloved Sega video games about the blue speedster, who has been starring in games for more than 30 years. Paramount brought Sonic into live action in 2020's "Sonic the Hedgehog" movie and a 2022 sequel. "Parks and Recreation" star Ben Schwartz voices Sonic in the movie franchise, and Jim Carrey plays his nemesis, Dr. Robotnik.

Combined, the first two films have made $707 million, according to TheNumbers.com. And Paramount seems to have faith in the third film: Variety reported that the studio has already greenlit "Sonic the Hedgehog 4" for spring 2027, ahead of the threequel's release on December 21.

"Sonic the Hedgehog 3" is competing with the Disney prequel "Mufasa: The Lion King" over the holidays. The Hollywood Reporter estimated that "Sonic" would come out on top, with $60 million from its first weekend versus $50 million for "Mufasa." The outlet, which sources its estimates from theater chains and major analytics companies like Nielsen and Comscore, is widely respected in the movie industry.

The difference is a shock given the historic power of the "Lion King" story: the 2019 remake of "The Lion King" raked in $1.6 billion worldwide, making it one of Disney's greatest financial successes. "Sonic the Hedgehog 3" earned a healthy 87% critic rating on Rotten Tomatoes ahead of its release, while "Mufasa" got a "rotten" 54% rating.