• Epic Games insists that Apple flouted its App Store anti-steering injunction
    appleinsider.com
Epic Games managed to force Apple into some App Store concessions with legal activity and campaigning, but it doesn't seem to have gone the way it planned. In a court filing, Epic lays out how it believes Apple has gone wrong with its steering rule changes.

Image: Epic's parody of Apple's viral 1984 ad campaign at the start of its legal fight - Image credit: Epic Games

The excruciatingly lengthy legal battles between Epic Games and Apple over In-App Purchasing and the App Store prompted a lot of outcry over how Apple conducts business with developers. With orders over anti-steering rules in an injunction, it seemed that Epic had gotten its way over some of its complaints to the court. According to a Post Hearing Findings of Fact filing with the U.S. District Court for the Northern District of California, Epic seemingly did not. The filing is effectively a 41-page complaint from Epic to the court, insisting that Apple has to comply more fully with the injunction set out in its main U.S. lawsuit.
  • AMD releases Capsaicin 1.2
    www.cgchannel.com
Saturday, March 8th, 2025 | Posted by Jim Thacker

AMD has released Capsaicin 1.2, the latest version of its open-source framework for developing real-time rendering technologies. The update adds support for morph target animation, meshlet-based rendering, ACES tonemapping, and the .dds texture format.

A GPU-agnostic, modular, open-source framework for developing real-time rendering tech
First released publicly in 2023, Capsaicin is a modular open-source framework for prototyping and developing real-time rendering technologies, primarily for games. It is designed for developing in broad strokes, creating simple, performant abstractions rather than low-level hardware implementations, and is not intended for tuning high-performance tools. The framework is intended for developing Windows applications, but is GPU-agnostic, requiring a card that supports DirectX 12/Direct3D 12 and DXR (DirectX Raytracing) 1.1.

AMD has used it in the development of its own rendering technologies, including an implementation of GI-1.0, its real-time global illumination algorithm. As well as the GI renderer, the framework includes a reference path tracer. Other features include ready-made components for Temporal Anti-Aliasing (TAA), Screen-Space Global Illumination (SSGI), light sampling, tonemapping, and loading glTF files. The framework also includes HLSL shader functions for sampling materials and lights, spherical harmonics, and common mathematical operations, including random number generation.

Capsaicin 1.2: blendshape animation, meshlet rendering and ACES tonemapping
To that, Capsaicin 1.2 adds support for rendering morph target (blendshape-based) animations, in addition to its existing support for skinned characters. The update also adds support for meshlet-based rendering, for streaming and decompressing high-resolution geometry at render time, in a way similar to UE5's Nanite system. Other new features include support for the .dds texture file format, used in games including Elden Ring and GTA V, and bloom and lens effects from AMD's FidelityFX toolkit. The update also adds a range of new tonemappers: the framework now defaults to ACES tonemapping, with support for Reinhard, Uncharted2, PBR Neutral and AgX, the latter now supported in Blender, Godot and Marmoset Toolbag.

Licensing and system requirements
The Capsaicin source code is available under an open-source MIT license. It can be compiled for Windows 10+ only, and requires a GPU that supports Direct3D 12 and DXR 1.1. Compiling from source requires Visual Studio 2019+ and CMake 3.10+. Find instructions here.

Read more about the new features in Capsaicin 1.2 on AMD's blog
Read more about the Capsaicin framework on AMD's GPU Open website
  • What you need to know about Manus, the new AI agentic system from China hailed as a second DeepSeek moment
    venturebeat.com
Manus AI is designed as a multi-agent system, meaning it combines several AI models to handle tasks independently.
  • Reality Games raises $4.7M for location-based Monopoly World
    venturebeat.com
Reality Games has closed a $4.7 million seed round to accelerate the development of its location-based real-world Monopoly game.
  • Split Fiction sells 1 million copies in two days
    www.gamedeveloper.com
Justin Carter, Contributing Editor | March 10, 2025 | 2 Min Read
Image via Hazelight Studios/EA.

At a Glance: Split Fiction's 1 million copies sold marks another win for Hazelight. It's the studio's fastest-selling game to date.

Hazelight Studios has announced that its latest co-op adventure Split Fiction sold 1 million copies within 48 hours of launch. The co-op action-adventure title released on March 6 for PlayStation 5, Xbox Series X|S, and PC. It's the fastest-selling game to date from the studio: 2021's It Takes Two and 2018's A Way Out took two weeks after their respective launches to reach that same milestone.

"The love you all show for our game is overwhelming!" wrote Hazelight. "Everyone here at Hazelight are beyond happy - and we can't stop enjoying your amazing reactions!"

Hazelight's games have gotten strong reviews and sold well. In a recent Washington Post interview, creative director Josef Fares revealed It Takes Two has sold 23 million copies, and A Way Out has sold 11 million copies to date. The studio is also the best-performing (and repeat) developer of EA's Originals label, which has also published Surgent Studios' Tales of Kenzera: Zau and Omega Force's Wild Hearts.

Player-wise, Split Fiction had a strong launch on Steam. By Saturday, March 8, it had 254,756 concurrent players on Valve's platform. That peak has now risen to 259,003 players at time of writing. Some of the game's popularity can be attributed to its Friends Pass system, which lets an owner share the game with a friend who doesn't own it so they can play together from start to finish. It's the highest-performing Hazelight game on Steam to date, surpassing peaks for It Takes Two (71,039 players) and A Way Out (8,582 players) by a substantial amount.

Hazelight and Split Fiction keep couch co-op alive
Why are Hazelight's games doing so well? Co-op seems to be the answer. Cooperative play is nothing new, but the studio's narratives and gameplay are built around the dynamics between its two leads, rather than just having player two be another person to have on hand. The split-screen focus also makes it an ideal game to play with family or a significant other, especially if the game is about such interpersonal dynamics.

Couch co-op is not a priority for mainstream titles. Hazelight's continued focus on it makes the studio a unique presence and helps give its games some more longevity. If the servers for It Takes Two ever go down, the game isn't completely dead, as is the case with other online-centric multiplayer titles. It also helps that the studio's games have been released with inviting other players in mind. Split Fiction's aforementioned Friends Pass comes with cross-play, which helps open things up significantly.
  • Infinity Nikki’s new interactive map makes it way easier to find what you need
    www.theverge.com
The developers of Infinity Nikki have finally solved one of the game's biggest problems: where the hell is everything? Piggybacking off similar efforts in other gacha games like Genshin Impact, the developers at Infold Games have created an interactive map that'll guide you to all the bits and bobs littered throughout the game's impressively expansive world.

Because Infinity Nikki is a gacha game, there are fifty-eleven different kinds of things to collect and trade for resources, currencies, and clothes. You can map some of the collectibles in Infinity Nikki to the ones in The Legend of Zelda: Breath of the Wild and it works out pretty well. Some of them are easier to find than others. The whimstars are like spirit orbs you earn from shrines, and they're pretty easy to find as the game has a built-in system to help you locate them. The dews of inspiration are like korok seeds; there's ten and a half million of them and your best chance of finding them is through sheer luck and determination, i.e. impossible. It kinda sucks that they're so hard to find, because dews can be exchanged for really cool outfits that can help out if you're stuck in one of the styling boss battles. The Infinity Nikki community has clamored for the developers to put a tracking system for them in the game, similar to the way your bipedal cat companion Momo can guide you to nearby whimstars. The developers have fulfilled other community wishes, like expanding the number of outfit slots from four to seven (thanks!), and now they've answered this ask in a big way.

Here's the map in action. Image: Infold Games

Visiting this site will grant access to an interactive map whereby you can finally locate all those damnable dews of inspiration. If you use your Infinity Nikki login information, the map will be personalized to show what items you've picked up and what you're still missing. Best yet, this map can be used for literally everything: missing treasure chests, secret clothing vendors, hidden caves, and more. It's not as convenient as an in-game map would be, but it updates in real time to make checking things off your list a little easier.
  • Sony will give the PS5 Pro crisper graphics by backporting FSR 4
    www.theverge.com
Today, the $700 PlayStation 5 Pro can already produce crisper, sharper, smoother, and more stable graphics than a PS5, if you sit close enough to appreciate them. But starting in 2026, the company hopes to imbue games with a new AI upscaling formula that'll make them even crisper, based on the AMD FSR 4 technique that's now shipping with AMD's new RX 9070 and RX 9070 XT graphics cards and appears to be competitive against Nvidia's latest DLSS as well.

"Our target is to have something very similar to FSR 4's upscaler available on PS5 Pro for 2026 titles as the next evolution of PSSR," PlayStation lead architect Mark Cerny tells Digital Foundry.

Many new PS5 Pro games already use an AI upscaler called PlayStation Spectral Super Resolution on their AMD-based chips. PSSR can turn 720p images into 4K ones on the fly while adding extra particle effects, and we've been relatively impressed by its quality compared to, say, FSR 3. But Sony and AMD's relationship didn't stop there: it turns out that the neural network that underpins AMD's new FSR 4 upscaler was part of a collaboration with Sony, too, and Sony now plans to backport some of that work to its flagship PlayStation.

In December, Sony and AMD announced a multi-year collaboration called Project Amethyst, one that we're now learning actually began back in 2023, and Cerny tells Digital Foundry that FSR 4 was the first fruit of those labors. "The neural network (and training recipe) in FSR 4's upscaler are the first results of the Amethyst collaboration," he says, calling it a more advanced approach that can exceed the crispness of PSSR. For now, though, Cerny says Sony is still encouraging developers to use PSSR, as it will take time to reimplement FSR 4's upscaling network into the PS5 Pro and its games.

Cerny suggests that Sony will have its own implementations of each algorithm it develops with AMD from here on out. He also hints Project Amethyst may be about more than home consoles, though: "Now to be clear, this technology has uses beyond PlayStation, and it's about supporting broad work in machine learning across a variety of devices. The biggest win is when developers can freely move their code from device to device," he tells Digital Foundry.

You can watch Digital Foundry's comparison of FSR 4, and its deep dive on PSSR, below.
  • Nuclei Detection and Fluorescence Quantification in Python: A Step-by-Step Guide (Part 2)
    towardsai.net
Author(s): MicroBioscopicData (by Alexandros Athanasopoulos)
Originally published on Towards AI.

Welcome back to the second tutorial in our series, Nuclei Detection and Fluorescence Quantification in Python. In this tutorial, we will focus on measuring the fluorescence intensity from the GFP channel, extracting relevant data, and performing a detailed analysis to derive meaningful biological insights. To fully benefit from this tutorial, it's helpful to have a basic understanding of Python programming as well as some familiarity with fluorescence microscopy, including the principles behind using fluorescent proteins like GFP (Green Fluorescent Protein).

In the previous tutorial, we used images of fibroblast cells where the nuclei are labeled with DAPI, a fluorescent dye (blue channel) that binds to DNA, and a protein of interest that is present in both the cytoplasm and nucleus, detected in the green channel. We began by preprocessing the images to enhance data quality. We applied Gaussian smoothing with varying sigma values to reduce noise and used thresholding methods to effectively distinguish the nuclei from the background. Additionally, we discussed post-processing techniques, such as removing small artifacts, to further refine the segmentation results.

The code below (from our first tutorial) effectively segments and visualizes nuclei in fluorescence microscopy images, offering clear insights into the distribution and intensity of the detected features. The next step in fluorescence quantification is to label the segmented nuclei.

from skimage import io, filters, morphology, measure, segmentation, color
from skimage.measure import regionprops
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

# Set option to display all columns and rows in Pandas DataFrames
pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)

# Load the multi-channel TIFF image
image = io.imread('fibro_nuclei.tif')

# Separate the GFP channel (assuming channel 0 is GFP)
channel1 = image[:, 0, :, :]  # GFP channel
# Perform Maximum Intensity Projection (MIP) on GFP channel
channel1_max_projection = np.max(channel1, axis=0)

# Separate the DAPI channel (assuming channel 1 is DAPI)
channel2 = image[:, 1, :, :]  # DAPI channel
# Perform Maximum Intensity Projection (MIP) on DAPI channel
channel2_max_projection = np.max(channel2, axis=0)

# Apply Gaussian smoothing to the DAPI MIP
smoothed_image = filters.gaussian(channel2_max_projection, sigma=5)

# Apply Otsu's method to find the optimal threshold and create a binary mask
threshold_value = filters.threshold_otsu(smoothed_image)
binary_mask = smoothed_image > threshold_value

# Create subplots with shared x-axis and y-axis
fig, (ax1, ax2) = plt.subplots(1, 2, sharex=True, sharey=True, figsize=(10, 10))

# Visualize the Maximum Intensity Projection (MIP) for the DAPI channel
ax1.imshow(channel2_max_projection, cmap='gray')
ax1.set_title('Maximum Intensity Projection (DAPI Channel)')

# Visualize the binary mask obtained after thresholding the smoothed DAPI MIP
ax2.imshow(binary_mask, cmap='gray')
ax2.set_title('Binary Mask (After Thresholding)')

# Adjust layout to prevent overlap
plt.tight_layout()

# Display the plots
plt.show()

Left Panel: This image shows the Maximum Intensity Projection (MIP) of the DAPI channel, which highlights the nuclei stained with DAPI (a blue fluorescent dye). Right Panel: This panel displays the binary mask generated after applying Otsu's thresholding to the DAPI channel.
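The artifact-removal step mentioned above is not repeated in the snippet. As a reminder, here is a minimal, optional sketch of that clean-up; the min_size and area_threshold values of 500 pixels are illustrative assumptions, not values from the tutorial, and the result is stored in a separate cleaned_mask so the pipeline above is unchanged.

# Optional clean-up of the binary mask before labeling (sketch, assumed values).
# Remove connected regions smaller than min_size pixels, which are more likely
# to be debris or noise than real nuclei.
cleaned_mask = morphology.remove_small_objects(binary_mask, min_size=500)

# Fill small holes inside the remaining nuclei.
cleaned_mask = morphology.remove_small_holes(cleaned_mask, area_threshold=500)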
Labeling the Segmented Nuclei

Labeling the binary mask is a crucial step in image analysis. When we perform thresholding on an image, the result is a binary mask (see also our previous tutorial) where pixels are classified as either foreground/True (e.g., nuclei) or background/False. However, this binary mask alone doesn't distinguish between different individual nuclei; it simply shows which pixels belong to the foreground and which belong to the background.

Labeling is the process of assigning a unique identifier (label) to each nucleus in the binary mask. In the context of connected components, labeling involves identifying and marking groups of connected pixels (components) that represent individual objects, such as nuclei, in the image. Once the binary mask is created, the connected components algorithm is applied. This algorithm scans the binary mask to detect groups of connected pixels using either 4-connectivity or 8-connectivity criteria (see the image below) and assigns a unique label to each connected component. Each label corresponds to a distinct nucleus in the image [1].

There are different types of connectivity, primarily 4-connectivity and 8-connectivity (a short code sketch after this section illustrates the difference in practice):

4-Connectivity:
Definition: In 4-connectivity, a pixel (of interest) is considered connected to another pixel if they share an edge. In a 2D grid, each pixel has four possible neighbors: left, right, above, and below.
Applications: 4-connectivity is often used in algorithms where diagonal connections are not considered, thus providing a more restrictive form of connectivity.

8-Connectivity:
Definition: In 8-connectivity, a pixel (of interest) is connected to all of its neighbors, including those that share a vertex. This means that, in addition to the four edge-connected neighbors (as in 4-connectivity), the pixel is also connected to the four diagonal neighbors.
Applications: 8-connectivity is used in applications where diagonal connections are significant, providing a more inclusive form of connectivity.

Left Panel: In 4-connectivity, the pixel of interest (highlighted in red) is connected to its four direct neighbors (up, down, left, and right), which are shown in blue. Right Panel: In 8-connectivity, the pixel of interest (highlighted in red) is connected to its eight surrounding neighbors (up, down, left, right, and diagonals), which are shown in blue.

Why Labeling is Important

Identification: Labeling allows us to identify and differentiate between individual nuclei within the binary mask. Each nucleus has a unique label, which makes it possible to treat and analyze each nucleus separately.
Analysis: Once the nuclei are labeled, we can measure various properties of each nucleus individually, such as area, perimeter, and fluorescence intensity. This is essential for quantitative analysis in biological research.
Visualization: Labeling also facilitates the visualization of segmented nuclei. By assigning different colors or intensities to each label, we can easily see and distinguish the segmented nuclei in a labeled image.
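To make the distinction concrete, here is a tiny self-contained sketch (not part of the original tutorial) showing how the connectivity choice changes the label count for two diagonally touching pixels. In skimage.measure.label, connectivity=1 corresponds to 4-connectivity and connectivity=2 to 8-connectivity for 2D images.

import numpy as np
from skimage import measure

# Two foreground pixels that touch only at a corner (diagonally).
tiny_mask = np.array([[1, 0],
                      [0, 1]], dtype=bool)

# 4-connectivity (shared edges only): the diagonal pixels are two separate objects.
labels_4, n_4 = measure.label(tiny_mask, connectivity=1, return_num=True)
print(n_4)  # 2

# 8-connectivity (edges and corners): the diagonal pixels merge into one object.
labels_8, n_8 = measure.label(tiny_mask, connectivity=2, return_num=True)
print(n_8)  # 1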
The code below is used to label connected regions (components) in our binary image. The function skimage.measure.label scans the binary mask and assigns a unique integer label to each connected component. The output is a labeled image (a 2D NumPy array) where each connected component is assigned a unique integer label (e.g., 1, 2, 3, etc.). Pixels that belong to the same component (e.g., a single nucleus) will have the same label. By default, the function uses 8-connectivity.

The function color.label2rgb(labeled_nuclei, bg_label=0) from the skimage.color module converts a labeled image into an RGB (color) image:
labeled_nuclei: the labeled image.
bg_label=0: specifies that the background label is 0, so the background will not be colored, and only the labeled regions (nuclei) will be colored differently in the output RGB image.

The segmentation.clear_border() function is used next to remove any nuclei that touch the edges of the image, ensuring that only fully contained nuclei are considered. The image is then relabeled to reflect the removal of these border-touching nuclei, and the updated count is printed. Finally, the labeled nuclei are visualized in color, with each nucleus annotated at its centroid using its corresponding label number.

# Label the nuclei and return the number of labeled components
labeled_nuclei, num_nuclei = measure.label(binary_mask, return_num=True)
print(f"Initial number of labeled nuclei: {num_nuclei}")

# Remove nuclei that touch the borders
cleared_labels = segmentation.clear_border(labeled_nuclei)

# Recalculate the number of labeled nuclei after clearing the borders
# Note: We need to exclude the background (label 0)
final_labels, final_num_nuclei = measure.label(cleared_labels > 0, return_num=True)
print(f"Number of labeled nuclei after clearing borders: {final_num_nuclei}")

# Visualize the labeled nuclei
plt.figure(figsize=(10, 10))
plt.imshow(color.label2rgb(final_labels, bg_label=0), cmap='nipy_spectral')
plt.title('Labeled Nuclei')
plt.axis('off')

# Annotate each nucleus with its label
for region in measure.regionprops(final_labels):
    # Take the centroid of the region and use it for placing the label
    y, x = region.centroid
    plt.text(x, y + 30, f"Nucleus: {region.label}", color='white', fontsize=12,
             ha='center', va='center')

plt.show()

Initial number of labeled nuclei: 19
Number of labeled nuclei after clearing borders: 15

This image displays the labeled nuclei after segmentation. Each nucleus is assigned a unique label, represented by a different color and annotated with its corresponding label number (e.g., Nucleus: 1, Nucleus: 2). The labeled regions correspond to individual nuclei, allowing for further analysis, such as quantifying fluorescence intensity or calculating various morphological properties. The black background represents the area that does not contain any nuclei, while the colored regions are the segmented and labeled nuclei.

Left Panel: Maximum Intensity Projection (MIP) of the DAPI channel, highlighting the nuclei stained with a fluorescent dye that binds to DNA. The red contours indicate the boundaries of the segmented nuclei based on thresholding and image analysis. Right Panel: The summed intensity of the GFP channel, which detects the protein of interest in the sample. The red contours represent the same segmented nuclei from the DAPI channel, overlaid to show the corresponding locations of the nuclei within the GFP channel.
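The tutorial does not include the plotting code that produced this contour-overlay figure. One way such a figure could be drawn, as a sketch only: the use of measure.find_contours and the variable name gfp_sum_projection are assumptions, not taken from the original article, and the snippet reuses channel1, channel2_max_projection and final_labels from the code above.

# Sketch: overlay nucleus boundaries (in red) on the DAPI MIP and on the summed GFP channel.
gfp_sum_projection = channel1.sum(axis=0)  # total intensity across all Z-slices

fig, (ax1, ax2) = plt.subplots(1, 2, sharex=True, sharey=True, figsize=(12, 6))
ax1.imshow(channel2_max_projection, cmap='gray')
ax1.set_title('DAPI MIP with nucleus boundaries')
ax2.imshow(gfp_sum_projection, cmap='gray')
ax2.set_title('Summed GFP intensity with nucleus boundaries')

# find_contours traces iso-valued curves; level=0.5 separates background (0)
# from foreground (1) in the binarized label image.
for contour in measure.find_contours((final_labels > 0).astype(float), level=0.5):
    ax1.plot(contour[:, 1], contour[:, 0], color='red', linewidth=1)
    ax2.plot(contour[:, 1], contour[:, 0], color='red', linewidth=1)

plt.tight_layout()
plt.show()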
Measure fluorescence

To measure the fluorescence in the green channel (GFP) of our multi-channel Z-stack image, we sum the pixel values of the GFP channel within the regions defined by our binary mask, instead of relying solely on the maximum intensity projection. This method (summing the pixel values) provides a better representation of the total fluorescence signal within each labeled region (nucleus), because it accounts for the entire intensity distribution rather than just the brightest pixels.

The code below calculates the total GFP fluorescence for each labeled nucleus in the image by summing the pixel intensities in the GFP channel. The resulting values are stored in a list for further analysis, such as comparing fluorescence across different nuclei or assessing the distribution of GFP within the sample. The operation channel1.sum(axis=0) sums the pixel intensities across all Z-slices for each (x, y) position in the image. This results in a 2D image where each pixel value represents the total fluorescence intensity at that (x, y) coordinate across the entire depth of the sample.

# Sum fluorescence in GFP channel within each labeled nucleus
gfp_fluorescence = []
for region in measure.regionprops(final_labels, intensity_image=channel1.sum(axis=0)):
    # channel1.sum(axis=0) has a data type of 64-bit unsigned integer
    gfp_sum = region.intensity_image.sum()
    gfp_fluorescence.append(gfp_sum)

# Print the total fluorescence for each nucleus
for i, fluorescence in enumerate(gfp_fluorescence, start=1):
    print(f"Nucleus {i}: Total GFP Fluorescence = {fluorescence}")

Nucleus 1: Total GFP Fluorescence = 80250
Nucleus 2: Total GFP Fluorescence = 164085
Nucleus 3: Total GFP Fluorescence = 490688
Nucleus 4: Total GFP Fluorescence = 241095
Nucleus 5: Total GFP Fluorescence = 174400
Nucleus 6: Total GFP Fluorescence = 373265
Nucleus 7: Total GFP Fluorescence = 384270
Nucleus 8: Total GFP Fluorescence = 657477
Nucleus 9: Total GFP Fluorescence = 484203
Nucleus 10: Total GFP Fluorescence = 390793
Nucleus 11: Total GFP Fluorescence = 430493
Nucleus 12: Total GFP Fluorescence = 438093
Nucleus 13: Total GFP Fluorescence = 402420
Nucleus 14: Total GFP Fluorescence = 387462
Nucleus 15: Total GFP Fluorescence = 513172

Data Analysis

The code above, in practice, calculated the integrated density, a measure used in image analysis to quantify the amount of signal (e.g., fluorescence) within a region of interest (such as a nucleus). In fluorescence microscopy, integrated density can be used to estimate the total amount of fluorescence in a given nucleus or cellular compartment. This can be useful for comparing the expression levels of a fluorescently labeled protein between different cells or experimental conditions.

The code below converts the gfp_fluorescence list into a pandas DataFrame for further statistical analysis, such as comparing fluorescence across different nuclei or conditions, calculating the mean and standard deviation, or performing more advanced analyses like clustering or correlation studies.

# Convert the fluorescence data into a DataFrame
df = pd.DataFrame({'Nucleus': range(1, len(gfp_fluorescence) + 1),
                   'GFP_Fluorescence': gfp_fluorescence})

# Display the DataFrame
df
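As a small illustration of the mean and standard deviation analysis mentioned above (a sketch, not part of the original article), the per-nucleus integrated densities can be summarized directly from the DataFrame:

# Summary statistics of the per-nucleus integrated density values
print(df['GFP_Fluorescence'].describe())  # count, mean, std, min, quartiles, max

# Or just the mean and standard deviation
print(f"Mean: {df['GFP_Fluorescence'].mean():.0f}")
print(f"Std:  {df['GFP_Fluorescence'].std():.0f}")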
By analyzing the distribution of fluorescence intensity across the nuclei, we can potentially reveal the presence of different populations or subgroups within the sample. This analysis could provide valuable insights, such as identifying distinct expression patterns or responses to treatment. Techniques like clustering can help in categorizing the nuclei based on their fluorescence profiles, enabling deeper biological interpretations.

# Plot histogram
plt.figure(figsize=(10, 6))
sns.histplot(df['GFP_Fluorescence'], bins=20, kde=True)
plt.title('Histogram of GFP Fluorescence Intensity')
plt.xlabel('GFP Fluorescence Intensity')
plt.ylabel('Frequency')
plt.show()

This figure shows the distribution of GFP fluorescence intensity across the nuclei in the sample. The x-axis represents the GFP fluorescence intensity, and the y-axis represents the frequency. The blue bars show the number of nuclei falling into each intensity range, and the blue line is a kernel density estimate (KDE) that provides a smoothed curve representing the underlying distribution.

Clustering Analysis

We can apply K-means clustering to group the nuclei based on their fluorescence intensity. This can help identify distinct populations that differ in their expression levels. In the scatter plot below, each point represents a nucleus, with the x-axis showing the nucleus index and the y-axis showing the total GFP fluorescence intensity for that nucleus. The points are color-coded based on the cluster they belong to. Two clusters are represented: cluster 0 (in green) and cluster 1 (in orange). The clustering was performed using K-means with two clusters. This plot demonstrates how nuclei can be grouped into distinct clusters based on their GFP fluorescence intensity.

from sklearn.cluster import KMeans

# Reshape data for clustering
fluorescence_data = df['GFP_Fluorescence'].values.reshape(-1, 1)

# Apply K-means clustering (let's assume 2 clusters for simplicity)
kmeans = KMeans(n_clusters=2, random_state=0).fit(fluorescence_data)
df['Cluster'] = kmeans.labels_

# Visualize clusters
plt.figure(figsize=(10, 6))
sns.scatterplot(x=df.index, y=df['GFP_Fluorescence'], hue=df['Cluster'], palette='Set2')
plt.title('K-means Clustering of GFP Fluorescence Intensity')
plt.xlabel('Nucleus')
plt.ylabel('GFP Fluorescence Intensity')
plt.show()

Together, these plots (histogram and scatter plot) indicate the presence of at least two subpopulations of nuclei based on their GFP fluorescence, potentially reflecting biological variability or different conditions affecting fluorescence expression.

Conclusion

In this tutorial, we explored advanced image processing techniques for segmenting nuclei and quantifying fluorescent signals using Python. By employing methods like Gaussian smoothing, thresholding, and connected component labeling, we were able to accurately identify and separate individual nuclei in the DAPI channel. We also demonstrated how to measure fluorescence intensity in the GFP channel by summing pixel values across Z-slices to capture the full distribution of fluorescence in each nucleus. Through data analysis, we were able to quantify and interpret the fluorescence signals, enabling deeper insights into biological variations.

References:
[1] P. Bankhead, Introduction to Bioimage Analysis. https://bioimagebook.github.io/index.html (accessed Jun. 29, 2023).
  • Implementing Lag or Lead Values (Only For Numeric Data) without Using the Lag() or Lead() Window Functions
    towardsai.net
Last Updated on March 10, 2025 by Editorial Team
Author(s): Kamireddy Mahendra
Originally published on Towards AI.

The concept of Range of Records in SQL with Sum or Average

Image by author

Window functions help us in different ways to get the data we need in a few lines of SQL. Without window functions, returning the same data might require joins, subqueries, or CTEs, which makes a query far more complex.

For example: given an employees table with several details, return those employees whose salary is higher or lower than the average salary of all employees in the entire organization, or than each department's average salary. We can solve this type of problem easily using window functions. Otherwise, we would need joins and subqueries or CTEs, which is a bit more complex. Agree?

Some window functions also allow us to take any range of records into account while calculating the result we want. Let's see how we can do that. Assume we have a sales details table, and we need to calculate the running total of sales over three successive days. We can use a window function with a frame that restricts the range of records, as in the sample code below.

select *,
       sum(sale_amount) over (order by sale_date
                              rows between 1 preceding and 1 following)
from sales_table

If you observe the code block above... Read the full blog for free on Medium.
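For readers who want to sanity-check the frame logic outside SQL, here is a small pandas sketch (not from the article; the column names sale_date and sale_amount simply mirror the SQL example, and the data is made up) computing the same centered three-row running total:

import pandas as pd

# Hypothetical data mirroring the sales_table used in the SQL example.
sales = pd.DataFrame({
    'sale_date': pd.to_datetime(['2025-03-01', '2025-03-02', '2025-03-03', '2025-03-04']),
    'sale_amount': [100, 200, 150, 300],
})

# A centered 3-row window (previous, current, next row) corresponds to
# ROWS BETWEEN 1 PRECEDING AND 1 FOLLOWING in the SQL query.
sales = sales.sort_values('sale_date')
sales['running_total'] = sales['sale_amount'].rolling(window=3, center=True, min_periods=1).sum()
print(sales)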
• Neil Druckmann Says There's a 'Dramatic Reason' Spores Are Back for The Last of Us Season 2
    www.ign.com
The Last of Us Season 2 showrunners Neil Druckmann and Craig Mazin have confirmed spores are back following their omission from Season 1. The latest trailer for the upcoming HBO show teased the introduction of spores after they were left out of Season 1. We see Ellie, played by Bella Ramsey, view an infected whose breath releases the spores.

Warning! Spoilers for The Last of Us Season 1 follow.

In the video games, the cordyceps infection spreads via spores, which forces the characters to wear gas masks at points. But for Season 1 of the TV show, spores were ditched in favor of tendrils, whose menace certainly appears more horrifying and cinematic. In Season 1, we see clickers sprout terrifying tendrils that simulate the way mushrooms communicate with each other through a root-like network.

There was a practical reason for the change: using spores in the show would mean big-name stars like Pedro Pascal would be forced to wear a gas mask every other scene. While this works on The Mandalorian thanks to his cool helmet, a lot of the tension would be lost in The Last of Us, and tendrils are also way more menacing, visually speaking.

"It's disturbing and it's violative. I think it's very primal in the way it invades your own body," Mazin said in an interview with Variety about the decision to swap spores for tendrils, particularly in the final scene where Tess receives what can only be described as the world's worst goodbye kiss.

The latest The Last of Us Season 2 trailer confirms the return of spores. Image credit: HBO Max.

Now, speaking at SXSW 2025, Druckmann confirmed there is "an escalation of numbers and types of infected, but also, as you see in the trailer, an escalation of the vector of how this thing spreads" in The Last of Us Season 2, adding: "Season 1, we had this new thing that wasn't in the game of these tendrils that spread, and that was one form. And then one shot you see in this trailer, there are things in the air."

Mazin later confirmed "spores are back," before Druckmann added: "The reason [we're doing it now], I mean, we really wanted to figure it out, and again, everything has to be drama. There had to be a dramatic reason of introducing it now. And there is."

In other The Last of Us Season 2 news, actress Kaitlyn Dever discussed playing Abby, admitting she finds it hard to stop herself from looking at the reaction on the internet. HBO plans to extend its adaptation of The Last of Us Part 2 beyond a single season, unlike the critically acclaimed Season 1, which covered the entirety of the original game. Mazin has said previously that Part 2 features a lot more story to cover, so while Season 3 isn't greenlit as of yet, they've built Season 2 with a "natural breakpoint" after just seven episodes.

Image credit: HBO Max.

Wesley is the UK News Editor for IGN. Find him on Twitter at @wyp100. You can reach Wesley at wesley_yinpoole@ign.com or confidentially at wyp100@proton.me.