• Alpenglow Community Park / Fieldwork Design and Architecture
    www.archdaily.com
    Portland, United States. Architects: Fieldwork Design and Architecture. Area: 4,475 m². Year: 2023. Photographs: Chris Murray. Manufacturers: Custom Rock, Evonik Industries, McNichols, Taylor Metal Products.
    Text description provided by the architects. Alpenglow Community Park, a newly developed 37-acre site in Bend, Oregon, showcases the natural beauty of the high desert landscape. The park's name and architecture are inspired by the volcanic peaks of the Three Sisters and the Cascade Mountain Range, which dominate the western view. The project features three structures: an event pavilion, a picnic/restroom shelter, and a pedestrian bridge. The material palette integrates weathering steel, ribbed concrete, and locally sourced Ponderosa Pine, drawing inspiration from the site's native vegetation, colors, and basalt rock formations.
    The event pavilion anchors the park's center, alongside the picnic and restroom shelter. The angular design of both structures mimics the nearby mountain range's silhouette, with rooflines that frame views within the park and beyond while folding to provide shade and shelter. The columns' geometry reflects the pattern of the surrounding forest. Ribbed concrete walls, inspired by local basalt formations, provide texture and continuity, while ceilings clad in Ponderosa Pine add warmth. Both structures operate off the grid, with solar panels providing electricity as needed. By grounding the architecture in the high desert landscape, the material palette connects each structure to its surroundings.
    The pedestrian bridge, which connects the park to the neighborhood to the west, spans an active BNSF railroad line west of the event pavilion. Weathering steel is used throughout, with perforated panels employed over a tube frame to comply with BNSF's fence protection standards while maintaining transparency and capturing views of the surrounding landscape.
    The structures at Alpenglow are designed to work in concert with the concepts and functions of the park, creating a place that will be meaningful to the community for generations. An integrated design process allowed the architects, landscape architects, and engineers to collaborate from the masterplan scale down to the smallest design detail. The result is a park that connects the surrounding community with the natural beauty of the site and landscape.
    The design of the park's structures is driven by the functional needs and requests of the surrounding community. By soliciting community feedback on other local parks, as well as early concept iterations, the designers were able to tailor the structures to the experiences the users were seeking.
    Although the park is located in a rapidly developing portion of Bend and is served by the surrounding infrastructure, the design team was challenged to provide park structures that function without being connected to the adjacent electrical system.
    Through sun studies and close collaboration with the engineering teams, the architects were able to design structures that generate 100% of their electricity through roof-mounted solar panels. The park is designed to provide connections between people and nature and to be an inclusive gathering place for everyone in the community. Amenities throughout the park encourage exploration of the native landscape and promote healthy activities. The park structures reinforce these values, as well as provide places for engagement and connection between community members.
    Published on February 24, 2025.
  • Intel's Modular Concept: Why It's Time To Rethink Laptop Design
    www.technewsworld.com
    While components in most desktop computers can be easily (well, relatively easily) replaced, laptops tend to be far more difficult to upgrade or replace. Intel is pushing a modular concept that could make upgrading your laptop less expensive than buying a new one and substantially reduce e-waste.
    A few years ago, Dell introduced Concept Luna, a laptop design featuring modular components that could be upgraded through a robotic upgrade station at a retailer. Similar to a vending machine, the system allowed a robot to handle the upgrade process.
    Let's talk about modular computers, which started with the IBM PCjr, moved to an IBM modular computer concept (I was approached to be CEO for one of the potential spin-off companies), the Dell effort, and finally, the Intel concept this week. We'll close with my Product of the Week, a brand-new laptop that could be ideal for an executive or someone interested in a lot of performance in a small, quiet form factor.
    A Brief History of Modular PCs
    The IBM PCjr was a revolutionary desktop design based on the ability to update the product easily. Updated components were packaged kind of like game cartridges: you bought the basic system and then upgraded it by purchasing and adding components that were covered in plastic and could be plugged into the base unit.
    It was an amazing design until the folks doing the more expensive IBM PC figured out it was so much better and cheaper that buyers would likely prefer it over IBM's more expensive, non-modular products. So, they crippled the PCjr by removing a lot of what would have made it attractive for business (the PCjr was positioned as a consumer PC). While making it unattractive for business, they also made it unappealing for the rest of us, so the product failed.
    Now, compare that approach to what Steve Jobs did with the iPod. He realized customers would likely prefer a phone with the iPod's functionality. Instead of crippling the iPhone to protect the iPod, he leaned into the trend and ended up nearly owning the entire smartphone segment. That strategic move is largely why Apple is one of the most valuable companies in the world. If IBM had done the same with the PCjr, it would have protected its leadership position in PCs and might still be in that business today.
    The next modular effort was the Archistrat 4s server from the Panda Project, a company staffed with ex-IBM people. It used a passive backplane, allowing components to be easily plugged in or removed to add accessories. The entire system was housed in a vault-like case made of heavy metal, lockable for security, and designed to be bolted to the floor so offices without server rooms could keep it in open areas without worrying about theft. That company failed because the executives had inferior financial skills and wasted a lot of money on parties, jets, and other things not related to the business.
    The next attempt was the IBM MetaPad, a modular computer with a core computing unit about the size of a cigarette pack, containing everything except a battery, keyboard, or display. It could be placed in a laptop case or docked on a desktop. A company called Antelope Technologies pursued a similar concept, aiming to create a market for it.
    However, due to its unique design and low initial production volume, it was relatively expensive, as were the docks that could have been integrated into cars, hospitals, hotel rooms, or even airplane seats, providing full PC functionality and an easy upgrade path for the core technology. The problem was that the core module wasn't very powerful due to thermal and cost constraints, and docks for it never became widely available in the places they were envisioned. As a result, it never reached the necessary volume to drive down costs. Performance was also an issue: mobile PC technology at the time wasn't efficient enough to make the concept viable in a laptop form. By then, IBM had begun shifting away from the PC market and was unwilling to throw money at a project like this. So, it died.
    More recently, Dell developed a green project, Concept Luna, that used a robotic upgrade machine that could be placed at locations like Best Buy so users could quickly and cheaply upgrade, modify, or personalize their laptop. This was one of the most well-thought-out concepts I've ever seen. The benefits started with a massive potential for reduced e-waste, lower lifetime PC ownership costs, and potentially more loyal customers. Sadly, Dell decided not to bring it to market.
    And That Brings Us to Intel's New Modular Laptop and Desktop Vision
    Like Dell's Concept Luna, Intel's modular effort appears primarily focused on reducing e-waste. It would have similar advantages to other earlier modular concepts by reducing life cycle costs, increasing design innovation by making it far less expensive to develop new design concepts, allowing the parts of a laptop that don't wear out to remain in service, and enabling greater customization in the market. This approach could lower upgrade costs for Intel and other component manufacturers while driving post-sale demand for CPUs and GPUs. It would also make it far easier for users to repair their laptops and provide relatively low-cost upgrades for almost every internal component.
    On the desktop side, Intel appears to be revisiting the Panda Project concept of a passive backplane, making it easier to upgrade motherboard components. Even though desktop PCs are more straightforward to upgrade, swapping out the motherboard is a PITA, and I often just donate my old desktop and start with a new case when I get a new motherboard. With Intel's design, instead of unscrewing the motherboard and hoping the new one fits, users could simply replace one or more plug-in cards, similar to the PCjr concept decades ago. Water-cooled implementations might still be tricky, but you could get around that by using quick disconnects and CPU modules that would hook up to the case's water-cooling solution.
    This approach could enable users to swap out a PC's CPU as easily as a GPU, enhancing chip-level competition and expanding the total addressable market (TAM) for modular components like CPUs, chipsets, and modems. In short, it would increase revenue in the market, allow for creativity in desktop and laptop PC design, and significantly reduce e-waste.
    Wrapping Up
    What's kind of ironic is that back in Andy Grove's day, Pat Gelsinger led an effort to increase PC innovation, making me think this effort might have been connected to him somehow. In any case, it showcases that Intel isn't done innovating and is still working hard to change the future of the PC. I genuinely hope it can bring this concept to market.
    HP EliteBook X G1a 14-inch Notebook
    Last week, I got a new laptop to play with, the HP EliteBook X G1a 14-inch.
    The one they sent me didn't have the OLED display that I would have favored, but the standard display isn't bad, and it's a lot easier on the battery since the OLED screen reduces battery life by around two hours. This laptop is one of the first to use HP's Poly Camera Pro software, which significantly improves the video conferencing experience with more natural backgrounds and a host of features that can make you look better in remote meetings.
    It is a full Copilot+ machine based on AMD's latest processor and graphics technology, and it performs like a champ. AMD has really stepped it up with mobile computers this year, and unlike older AMD designs, which tended to lag Intel, this effort is very competitive.
    The EliteBook X G1a starts at just under $2,200 and can go up to nearly $2,800 if you want a touch OLED display, more storage, and more memory. All of the display options are 400 nits, which is fine, but they will likely wash out in direct sunlight, although HP does a nice job with antiglare technology, so it should work well enough outside in a pinch.
    The HP EliteBook X G1a 14-inch Notebook (Image Credit: HP)
    Without the OLED screen, it will get up to 17 hours of battery life watching movies, but less if you do something CPU- or GPU-intensive. With an NPU rated at 55 TOPS, this is one of the better AI PC configurations, and HP has gone to a lot of trouble to make it really quiet.
    The laptop uses an impressive amount of recycled material, though you'd never know it from its premium design. While not a hardened PC, it has passed some military-specified tests, suggesting it is more robust than most in its class. As with most HP PCs, it includes strong security features, such as blurring the screen if it detects someone looking over your shoulder.
    Beyond its durability and security features, this is also one of the first AI PCs to fully embrace AI capabilities, including an AI companion for workflow assistance, an AI-enhanced webcam, rapid document and chart analysis, and an AI-powered service and support experience, something likely to become more common. Designed for tech experts like software developers and IT executives, the laptop delivers strong performance that both groups will appreciate. While not built for gaming or workstation-level tasks, it remains a high-performing PC that competes well in its category.
    Overall, the HP EliteBook X G1a 14-inch is a pretty awesome laptop and my Product of the Week.
  • New species of fuzzy sunflower found by national park volunteer
    www.popsci.com
    The wooly devil (Ovicula biradiata) is a new-to-science flowering plant in the sunflower family. CREDIT: James Bailey / California Academy of Sciences.
    A photo uploaded to the popular citizen science social network iNaturalist is a snapshot of the first new genus and species of plant discovered in a US national park in almost 50 years. The wooly devil (Ovicula biradiata), which has bright red petals and is a member of the sunflower family, was found in Big Bend National Park in Texas. It is detailed in a study recently published in the journal PhytoKeys.
    Big Bend National Park is located within the Chihuahuan Desert. This well-studied region is the largest and most biologically diverse warm desert in North America and is home to coyotes, quail, wild horses, alligator lizards, and more. While the Chihuahuan Desert has been the subject of several botanical surveys over the last 100 years, this is the first new plant genus in a national park that scientists have described since 1976. That plant, the mountain-dwelling shrub July gold (Dedeckera eurekensis), was found in Death Valley National Park.
    "While many assume that the plants and animals within our country's national parks have probably been documented by now, scientists still make surprising new discoveries in these iconic protected landscapes," Isaac Lichter Marck, a study co-author and California Academy of Sciences plant taxonomist and ecologist, said in a statement. O. biradiata is a member of the sunflower family, although it does not resemble its sunburst-shaped relatives at first glance.
    The team sequenced the plant's DNA and compared it with other specimens in the California Academy of Sciences herbarium. The sequencing revealed that this small and fuzzy plant is both a new species within the sunflower group and distinct enough to be a completely new genus.
    Park volunteer and study co-author Deb Manley first spotted the plant in March 2024 and harnessed the power of international botanist crowdsourcing to identify this unknown species. O. biradiata is the type of plant that botanists colloquially call a belly plant: small and discreet plants that can only be properly observed by lying on the ground. It is a distinctive wildflower with furry white foliage and maroon-colored ray petals. O. biradiata is also quite an ephemeral species, only blooming after rain. It is found in harsh rocky habitats with limited rainfall and grows alongside drought-tolerant shrubs, such as ocotillo, hedgehog cactus, and creosote.
    The small, fuzzy flower grows in the harsh, rocky soils of the Chihuahuan Desert and only appears after rainfall. CREDIT: Big Bend National Park.
    "Plants that thrive in deserts are often quite unique, having evolved specific mechanisms to withstand the extreme drought-and-deluge conditions of these arid landscapes, from water-storing structures to rapid life cycles triggered by rain," said Lichter Marck. "But as climate change pushes deserts to become hotter and drier, highly specialized plants like the wooly devil face extinction. We have only observed this plant in three narrow locations across the northernmost corner of the park, and it's possible that we've documented a species that is already on its way out."
    Ovicula biradiata's name was inspired by its wooly appearance and the bright red petals. Ovicula means "tiny sheep" and refers to the thick, white hairs that cover its leaves.
    It also honors one of Big Bend's most iconic endangered species, the bighorn sheep (Ovis canadensis). Biradiata, or bi-radial, refers to the two ray florets found on each of the plant's flowers. The team working with the plant affectionately called the fuzzy flower the wooly devil, which has become its suggested common name.
    "Now that the species has been identified and named, there is a tremendous amount we have yet to learn about it," study co-author and Big Bend National Park botanist Carolyn Whiting said in a statement. "I'm excited to discover whether there are other populations in the park, the details of its life cycle, what pollinates it, and whether we'll observe it this spring, given the current drought."
    Visitors to Big Bend can contribute by documenting wildflowers they encounter following upcoming spring rains and uploading their observations to iNaturalist. The team from this study will also be probing the wooly devil's potential medicinal properties.
    "Under the microscope, we noticed specific glands that are known to possess compounds with anti-cancer and anti-inflammatory properties in other plants within the sunflower family," study co-author and California Academy of Sciences researcher Keily Peralta said in a statement. "While further research is needed to determine these properties, this discovery underscores the potential knowledge we stand to gain from preserving plant diversity in fragile desert ecosystems."
  • Author Correction: The Ronne Ice Shelf survived the last interglacial
    www.nature.com
    Nature, Published online: 24 February 2025; doi: 10.1038/s41586-025-08806-5. Author Correction: The Ronne Ice Shelf survived the last interglacial.
  • Author Correction: HIV-1 Env trimers asymmetrically engage CD4 receptors in membranes
    www.nature.com
    Nature, Published online: 24 February 2025; doi: 10.1038/s41586-025-08802-9. Author Correction: HIV-1 Env trimers asymmetrically engage CD4 receptors in membranes.
  • it's just for fun
    i.redd.it
    submitted by /u/rimsckei
  • What is your typing speed? (WPM) [ K70 CORE TKL ]
    x.com
    What is your typing speed? (WPM) [ K70 CORE TKL ]
  • Re @ayebrianYT @elgato Our Platform desks do have a basket underneath to help with cable management
    x.com
    Re @ayebrianYT @elgato Our Platform desks do have a basket underneath to help with cable management
  • Get Started With Physically Based Lighting In UE5 With PBL Database
    cgshares.com
    Introduction
    Hi everyone, my name is Arthur Tasquin, and I'm a Real-Time Artist with over seven years of experience using Unreal Engine. I previously worked in the VFX industry on several virtual productions, real-time experiences, and previsualizations, and I'm now interested in the gaming industry.
    The subject of this article is the release of my next Unreal plug-in, a tool I've been spending most of my time on for the past few months: PBL Database. While learning about the physically based lighting workflow in 2024, I was surprised by the lack of resources available online. I decided to scout for any information I could find on the subject and put what I'd learned to the test through a project called Look Up Doodles. I described my entire process and combined all my discoveries in a voluminous 80 Level article: The Science of Lighting. While we will briefly review a few key notions here, I highly recommend you read that one thoroughly, as it will help you build a solid understanding of the basics. In that article, I also teased a little in-editor tool that grew in scale over the months to become the plug-in I'm presenting to you today.
    What is PBL?
    PBL stands for Physically Based Lighting, a workflow that consists of using real-life light values and exposure settings for CG lighting.
    Why should you use the PBL workflow?
    "Learn the rules like a pro, so you can break them like an artist." (Pablo Picasso)
    I'm a strong believer that learning the way things operate under the hood will improve your final result. By integrating core principles and nature's laws, you're building mental pathways that will be useful for your future creative endeavors.
    In the field of lighting, using physically based data helps you achieve consistency between your different sources of light. The numbers can go from 0.001 for a candle to 100,000 for the sun (a span of roughly 27 stops), so it can be complicated to achieve both a realistic and pleasing balance with arbitrary values. Moreover, when using the PBL workflow, you emulate the camera's limitations.
    When to stop?
    A word of advice: when using this workflow, it's easy to get fixated on the technicality. If you spent the last hour trying to fit two values together, you've lost yourself in the details. Take a step back and remember that PBL is only a tool and should only be a means to an end. Breaking the rules of physics is not wrong if it's intentional. Movies and video games are, in the end, smoke and mirrors. Don't let physical values stop you from achieving a specific look you're after, a mood, or an idea. Use them only as a starting point and tweak them as much as you want.
    PBL Database
    The PBL Database came to life from the need for guidance in the learning process of the PBL workflow. When studying the subject, one thing struck me: I couldn't find enough data to cover the usual lighting scenarios. A few websites or images were often referenced, but they provided very little. The information was scattered across the internet, and most of the time, the sources were not even there. From that moment, I decided to take matters into my own hands and do the dirty work.
    I've spent 2024 sampling lighting data everywhere I went to build the most physically accurate collection of PBL values I could. The PBL Database is a time saver: by gathering both external and internal resources, coupled with a set of unique tools, this plug-in is the perfect place to start learning PBL. Here is a non-exhaustive list of its features:
    - Access to a growing database of physically accurate lighting data (lux, cd/m², lm, and EV100) with sampling notes and proof (pictures and videos of the sampling environment, light meter, and sky);
    - Everything in one place, everything in the engine;
    - Sorting of the data based on multiple modifiers: Unit, Location (Outdoors or Indoors), and Type of Data (Artificial or Natural Light);
    - A lighting scenario selector with two parameters (i.e. cloud coverage and time of day) that provides the exact data you're looking for;
    - A description panel showing the definition based on your modifier selection to ease the learning process;
    - External resources (useful cheat sheets, relevant links) in the palm of your hand;
    - The Sunny 16 rule and its derivatives, with the possibility to dynamically change one of the two camera settings to fit your needs;
    - The Camera Settings Convertor: a tool to understand the relationship between Shutter Speed, Aperture, and EV100 and find the best camera settings for your scene;
    - A Light Temperature converter to find the right RGB color for your emissive.
    The plug-in is divided into three tabs: Natural, Artificial, and Utilities. At the top of each tab, the common toolbar gives you access to external resources like cheat sheets and useful links, and also a shortcut to the HDR View mode.
    Natural & Artificial Lighting
    The Natural and Artificial tabs look almost identical because they work the exact same way. The only difference is the type of data they take into account. Let's see how we can use them. First, you need to figure out what kind of data you're looking for. Select the tab depending on your light source: natural (sun and sky) or artificial (electricity). The modifiers will then help you narrow down the search.
    Modifiers
    The Unit toggle will define the type of actors you're working with in Unreal. It is really important that you understand what lux, lm, and cd/m² are to work in PBL. Each of them has a different approach, both in capturing the data and in using the data.
    The Illuminance (lx)
    The illuminance, measured in lux (lx), is the quantity of light that hits a surface. It is captured through an incident light meter, also called an ambient meter (or, in my case, a lux meter). This device samples the light hitting a little white hemisphere. This is the type of data you'll find the most in the PBL Database, as I only used a lux meter in my sampling sessions. All the other types of data are gathered from the web. In Unreal, there are two places you will find this unit: the Directional Light intensity, and the HDR View mode, Unreal's own light meter that you can use to monitor the lux of your scene. We will go over this feature in the lighting studies below.
    The Luminance (cd/m²)
    The luminance, measured in nits (cd/m²), is the quantity of light that is emitted from a surface. It is captured by a reflective light meter, also called a spot meter. This kind of device is what you can find inside cameras to estimate the correct exposure of what you aim at.
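    For readers who want to connect these meter readings to exposure, here is a minimal sketch of the standard exposure-meter relations that map a spot-meter luminance (cd/m²) or an incident-meter illuminance (lux) to an EV100 value. It is only an illustration of the underlying math, not code from the PBL Database, and it assumes the commonly quoted calibration constants K = 12.5 and C = 250 (real meters vary slightly).

    ```python
    import math

    # Standard exposure-meter relations (ISO 2720), using commonly quoted
    # calibration constants: K = 12.5 for reflected/spot metering and
    # C = 250 for incident (lux) metering.
    K_REFLECTED = 12.5
    C_INCIDENT = 250.0
    ISO = 100  # EV100 is defined at ISO 100


    def ev100_from_luminance(luminance_cd_m2: float) -> float:
        """EV100 a spot meter would suggest for a surface luminance in cd/m^2."""
        return math.log2(luminance_cd_m2 * ISO / K_REFLECTED)


    def ev100_from_illuminance(illuminance_lux: float) -> float:
        """EV100 an incident (lux) meter would suggest for a given illuminance."""
        return math.log2(illuminance_lux * ISO / C_INCIDENT)


    print(round(ev100_from_luminance(8000), 1))     # clear daytime sky: ~16.0
    print(round(ev100_from_illuminance(80000), 1))  # sunlit exterior: ~15.0
    ```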
    None of the luminance data was captured by me, but I'm planning to buy a spot meter for a future update (more on this at the end of the article). In Unreal, there are two places you will find this unit: in any emissive material (lamps, screens, or the sky sphere / HDRI Backdrop), and in the HDR View mode.
    The Luminous Flux (lm)
    The luminous flux, measured in lumens (lm), is a measure of the perceived power of visible light emitted by a source. Nowadays, you can find this data on any bulb packaging to express the brightness of the light source. In Unreal, you will find this unit in every local light (point, rectangle, and spot light) as long as you set up the light unit correctly in the Project Settings. As you can imagine, this unit is only available in the Artificial tab.
    The Location
    Most of the time, working with natural lighting implies an outdoor setting, and artificial lighting an indoor one. Nonetheless, it happened that I needed to capture a street light at night or a sunlit office in the middle of the day. For that reason, you can tweak the location depending on the case and on the matching data.
    Data
    Once you're happy with your modifiers, you can explore the available lighting scenarios. If your modifiers are lux and outdoors, you will have the choice between the cloud coverage and the timing of the samples. Each scenario can contain one or more samples. In this case, on a clear-sky late afternoon, the first row indicates sunlit values (by placing the light meter in direct sunlight) and the second one shade values (by placing the light meter in the shadows). A row not only contains an average value but also a range and a median EV100.
    Why do you need a range?
    In reality, a lot of factors can alter the data. I call them factors of variation: you can find them in the description of each lighting scenario. The numbers depend on the weather, the pollution, the time of day, the season, the location, the surroundings, and the clouds. They can also be affected by our own differences in sight, the device's margin of error or tilt, and more. This makes it difficult to build a consistent database, as you can't reproduce the exact setting I was in at the moment of sampling. That's why I designed the tool in a looser way. By introducing a range that comes from a collection of data across multiple sampling sessions, I remind you that the most important thing is the overall scale. While the sun's impact on Earth can vary greatly, often by thousands of lux, the intensity of a candle will fluctuate within a much smaller range. It's your role to adapt the physical dimension of the scene to your lighting intentions.
    Edit (Experimental)
    The Edit section is an experimental feature that should only be used if you know what you're doing. Its main purpose is to directly tweak the lighting actors in the scene based on the selected data. While the Post-Process Volume and the local light (in the Artificial tab) will work as intended, the way you use the Directional Light and HDRI Backdrop will impact how physically accurate your lighting is. This will be developed further in the HDRI vs. Sky Atmosphere section.
    Recap
    Let's review the process we just saw:
    - Is the lighting artificial or natural?
    - What unit are you searching for? 1. lux: for sunlight or ambient light checking (with the HDR View mode); 2. lm: for artificial local lights; 3. cd/m²: for the sky sphere or emissive objects.
    - Is the location of the scene outdoors or indoors?
    Once you answer those questions, you can display the data based on your lighting scenario selection and copy-paste it into the appropriate actors!
    Utilities
    The plug-in also comes with a series of utilities:
    - The Camera Settings Convertor highlights the relationship between the Aperture, the Shutter Speed, and the EV100;
    - The Sunny 16 rules are methods to estimate correct daylight exposures without a light meter;
    - The Kelvin to RGB converter allows you to find the right emissive color for a given Kelvin temperature.
    I won't go over them here, but I'll use some of them in the practical examples below.
    Lighting Study: The High School Science Class
    Disclaimer
    Before we get into practical examples, something needs to be addressed. While I spent a lot of time studying the subject, I'm not done learning about the PBL workflow. When you scratch the surface, things can get really complicated and scientifically more advanced. Although I don't have an engineering degree, I tried to translate my understanding of the matter. My point of view is that of an enthusiast who saw an opportunity to reduce the friction it takes to learn the subject.
    In the following section, I will cover how I integrate PBL into my work. I'll try to be as transparent as possible regarding its physical flaws. With that in mind, you should not take everything I say here for granted. Make up your own opinion through experimentation and iteration. This is especially the case for use in production: the PBL Database should not substitute for a proper pipeline. If you do have experience in the matter, I would be very happy to chat and connect. This subject, unfortunately, still lacks coverage, and the few resources out there often point in different directions. Keep in mind there are always several ways to get things done; choose the one that suits you the most. The tool is meant to help you in your lighting journey, but you don't need to use every feature it contains.
    HDRI vs. Sky Atmosphere
    I'm confident in saying that the plug-in is as physically accurate as it could be (see "Why do you need a range?"), but how you use it is where results will really differ. In this section, we're going to talk exclusively about natural lighting and the consequences of using the Edit Mode from the Natural tab.
    To understand the issue, we need to go back to the way I sampled the data. Each scenario comes with a collection of values I took in different places and at different times using an incident light meter. This kind of device records the illuminance (lux) of a specific spot in space. Those samples represent most of my own contribution to the database (the rest being gathered from the web), and they should be used with the HDR View mode.
    If we take the supermarket example, which is lit only by artificial light, the values in my database vary from 79 lx to 1,621 lx. The process would be to place supermarket lights in your environment and then use the HDR View mode to monitor the illuminance of the place. A few questions could then arise: Is the ceiling too high? Do I need to increase the number of lights? Is the placement of the lights correct? This way of working is perfect because you're working with two types of units: first the brightness of the lamp in lumens, and then the illuminance of the place in lux. While this works for artificial lighting, it's a whole different thing for natural lighting, as the sun's unit is lux. In real life, the sun doesn't change its intensity.
    The only things that make the result differ on my incident light meter are the factors of variation, and the most impactful one is cloud coverage. By going through the atmosphere and the layers of clouds, the light scatters, and a fraction of its original intensity reaches me. So, how can we reproduce that in Unreal?
    The first answer is the Sky Atmosphere. This is both the easiest and the most physically accurate one. The way this actor operates is that it does all the math for you in terms of atmospheric scattering, and it drives the sun's intensity and color accordingly based on its rotation. Coupled with volumetric clouds, you get appropriate shadows and light intensity, and the whole thing is dynamic. This means that the lux values available in the Natural tab serve the same purpose as the artificial ones: monitoring the overall illuminance of the scene. The only thing to do, really, is to set the sun's intensity to 120,000 lux, as mentioned at the end of Unreal's official documentation.
    This method, though, comes with its drawbacks: by linking the sun to the Sky Atmosphere, you lose artistic control over the lighting. The sky is one of the most important and biggest light sources around, and its diversity in intensity and color is crucial to creating impactful lighting. I'm a big fan of using HDRIs in my work, and I don't really like the look of the Sky Atmosphere or the Volumetric Clouds. Now, I could use an HDRI that doesn't contribute to the lighting coupled with the Sky Atmosphere, but that would mean the skylight doesn't take into consideration that amazing range of color from my sky. With the Sky Atmosphere, you lose control over the color and intensity of the sunlight as well as the brightness of the sky.
    As my approach is more about creating the most pleasing visuals, I usually tend towards the HDRI approach, which is what we're going to dive into. Keep in mind it's not the most physically accurate workflow. It works for me, but it might not be relevant to you.
    Preparing the Scene
    For the following lighting studies, I will use the High School Science Lab Classroom pack from Dekogon Studios, which has been an industry standard for years. By choosing this environment, I'm confident I will work with optimized assets and a coherent art direction. My goal will be to push the quality as far as I can to produce beautiful static shots for a cinematic, using only Lumen and no post-production. While the PBL workflow isn't bound to real-time dynamic lighting, we will use neither static lighting nor path tracing to make those shots.
    The first thing to do for any lighting study is to inspect the level you'll be working in. Although it's extremely well optimized, you can see that the quality of the environment is quite impressive. We will build the lighting from scratch, so let's get rid of it. I usually prefer to move the original lighting into a sub-level so that I can compare it to mine. I will also work on my new lighting in another sub-level to keep things organized. Don't forget to set right-click/Change Streaming Method/Always Loaded on your lighting sub-level if you want to render it. Without that option, your sub-level will not be loaded when you render.
    Our first issue arises: turning off the original lighting doesn't remove it from the scene. This occurs because the lighting is baked. To remove all baked lighting, you can rebuild the lighting (Build/Build Lighting Only) or turn off Allow Static Lighting (Engine/Rendering/Misc Lighting) in the Project Settings.
    If your plan is to use only dynamic lighting, I would recommend the second option, as it will be cleaner and more optimized.
    While we're on the subject of Project Settings, there are a few of those you should consider if you want to output the best result from the engine. Here's a list of the changes I made for this project (all of these settings are located in Engine/Rendering).
    Natural Lighting: A Sunny Afternoon
    For this example, we'll go with a typical sunny afternoon. I want the warm light of the sun to fill the room and play with the contrast. The first thing to do is spawn the basic light actors: the Directional Light, the HDRI Backdrop, and the Post-Process Volume. You can see that we already have a few issues here: the sun seems to bleed through the walls and the ceiling, the lens effects are way too strong, and Lumen isn't working properly.
    As this environment was made with a modular approach in mind, all the foundations of the room were divided into blocks. This, coupled with the thickness of the assets, can introduce light bleeding, but there's a very simple way to fix it. By placing unlit dark blocks outside the room, I'm making sure that the only way the light comes in is through the windows.
    When lighting a scene, it's always better to start with a neutral setup. We don't want any effect to clutter the space. That's why I always set the Post-Process Volume to unbound and the Bloom, Lens Flare, and Vignette to 0.
    For this scenario, I chose an HDRI from the website Location Textures, which provides a lot of free, high-quality content. Something I really like about them is that each of their images comes with additional data. I chose this HDRI both for the look of the sky and for a time of capture that matches our scenario. You can see that the photo was taken in the UK on 01.08.2019 at 3:30 PM. When I know the timing, I usually like to inspect where the sun is in the sky at that moment. With Time and Date, you can retrieve sky data from a date and a location. By entering those parameters, you can see that the sun at 3:30 PM was in the first half of the afternoon (between solar noon and sunset), which in my tool is indicated as Early Afternoon.
    Using the Tool
    The Sun
    Now that the scene is neutral, we have gotten rid of all the issues, and the basic light actors are there, we can start using the tool. Like I said earlier, I want a sunny day in the early afternoon, and I will start with the sun. To retrieve that data, I need to be in the natural lighting tab with the lux unit selected, and I will first balance my scene for an outdoor setting.
    How can we interpret that data?
    - In direct sunlight (first row), the range varies from 9,155 lx to 86,700 lx. This is a huge difference (roughly a 10x factor), and I need to understand why;
    - The notes indicate that the smallest samples were taken in a forest, which might have affected the result;
    - There are two times of sampling, and the first one matches the timing of my HDRI (date and hour).
    I want to see the behind-the-scenes to understand how those values were taken. To do so, we can click on the link icon next to the scenario, which will bring us to a Google Drive. After choosing the date and inspecting the pictures, I can say that in a more open area, the illuminance is closer to 80,000 lux, which is in the high part of the range. Now that we understand the values, let's try to use them. With the Edit mode, we can iterate within the given range and find the proper exposure for our scene.
    I'll set the Directional Light intensity to 77,000 lux. A proper exposure simply means keeping as much detail as possible in both the shadows and the highlights. It's important to start with balanced settings, but that doesn't mean you will have to stick to them until the end. Right off the bat, we can see that the numbers work pretty well. The sunlight isn't blown out or too dark, but we have lost our sky. Don't worry, that's totally normal, as the sky is not yet set to a more physical range like we just did with the sun.
    One last thing I would like to do is verify that my sun's impact on the scene matches its intensity. Using the HDR View mode, Unreal's own in-engine light meter, we can see that the upper square shows 76,179 lx, a value very close to the one I used.
    The Sky
    To calibrate our sky, we will need the data available in the cd/m² toggle. The numbers for a clear sky vary from 7,000 to 10,000 cd/m², but what does that really mean? HDRIs are far from uniform; they vary from pixel to pixel both in intensity and in color. Likewise, the sky's luminance completely changes depending on where you sample it. This is why any luminance value should always come with a description of where and when the value was taken. In our case, the range covers the intensity of a clear daytime sky at the horizon. Our goal now is to tweak the intensity of the HDRI Backdrop until we reach a value that sits in that range. Let's click on the average value and paste it into the HDRI Backdrop intensity.
    At a quick glance, it seems pretty well exposed, but let's review this with the HDR View mode. The luminance meter (lower square of the HDR View mode) indicates 7,979 cd/m² on the side of the sky opposite the sun; that's pretty close to our range, so I'm going to leave it that way. As the values fluctuate depending on where you point the HDR View mode, I always like to cross-reference information, so for this example, I also used the white illuminated cloud value of 10,000 cd/m².
    Although my incident light meter covered a substantial range of scenarios, I didn't capture any luminance data, as I needed another device for that. All the luminance values come from external resources that you can still access through the link icon. The lack of those makes this step less precise and requires you to eyeball the perfect luminance. This is something I would like to improve in a future update.
    Our scene is now well exposed for an outdoor setting, but we're dealing with an interior here. If you go inside the room, everything is pitch black except the sky outside the windows. The ideal EV100 for an interior is 5 or 6, depending on the amount of light. You can find that in the EV Cheat Sheet from the upper toolbar.
    The question here is: what do you want to expose for? As we used physically accurate lighting data, we emulated the camera's limitations. You can't expose for both the highlights from the sun and the shadows in the class. Like a photographer, you need to choose what the subject of the photo is. The rest of the process is quite similar to taking pictures with a real camera. You just have more control over the scene if you want to change the lights or play with the shadows. I also added an Exponential Height Fog, turned on the volumetric checkbox, and played with a few post-process settings. Here are a few shots from that session.
    Artificial Lighting: A Neon Night
    So, what changes when you're dealing with an artificial scenario?
    As previously mentioned, you will mostly use local lights, which are expressed in lumens. For this lighting study, I would like to use the neons already placed in the environment to light the classroom at dusk. Most of the steps are very similar to the ones we saw in the previous example.
    Although I'm not using sunlight, I would still like to place a very faint blue-hour/dusk HDRI in the scene. In this example, I used Blue Clouds 2686 from Location Textures. The process is exactly the same as with the natural setting: I sample the sky with the HDR View mode and, based on luminance data, I tweak the intensity of the HDRI Backdrop.
    Thanks to the EV Cheat Sheet, we already know that the average interior exposure is around 5-6. Let's put 5 in the Max and Min EV100 of the Post-Process Volume and enter the classroom. The result is pitch black, and it shouldn't be otherwise, as we haven't introduced any light into the environment.
    Neon Lights
    The project already comes with neon lights that have an emissive channel. Turning on the emissive should make a difference in my scene, but that's not the case. Why is that? As is often the case, Unreal's default values don't represent the real scale of light intensities. Similarly to the default 10 lux of the Directional Light, an emissive value of 1 will only be visible if your EV100 is at 1. As a base, let's crank this value up to 500 and see what happens. The setup doesn't come with an emissive strength parameter, only a color, so what I did here is increase the value of the color itself.
    This shows us something really important: the impact of emissive objects on our lighting. Lumen is a great tech, but when working with PBL, it can get in the way. We're going to light our scene with local lights using real-life light intensities in lumens. On top of that, we need to use emissive materials to simulate where the light comes from, and we need to keep in mind that they will add to the already calibrated lighting. This might not sound like an issue when dealing with an intensity of 500, but we're far from reality here. This is where Unreal falls apart in terms of PBL: here's what the scene looks like when using calibrated values from a spot meter pointed towards a fluorescent lamp. The only solution would be to exclude an emissive object from Lumen, but I couldn't find any way to do that. For now, we will trust our gut feeling and tweak the emissive intensity accordingly.
    There are several ways to link a local light to the neon mesh, but I chose to use a rectangle light just beneath the two cylinders. To be a bit more accurate, I could have put two point lights shaped exactly like the neon tubes (using the source length and radius) and removed the shadows of the tubes, but I didn't want to double the number of lights in the scene.
    The rest is pretty straightforward: I used the light intensity range available through the lm modifier to tweak the rectangle lights. Neon lights vary from 500 lm to 5,000 lm, so I need to think about how they are used here: a science lab is a place where you need to see clearly what you're doing, so it needs functional lighting. I can then use the HDR View mode to compare the illuminance of the place to another functional space I have in the database.
    Utilities
    To finish this study, I would like to go over two utilities. The Kelvin to RGB converter will help us translate a light temperature into an actual emissive color. Thanks to the Light Temperature Cheat Sheet, we know that the fluorescent temperature is at 4000 K.
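    As an aside, converting a color temperature to an RGB value is usually done with a fitted approximation of the black-body curve. The sketch below uses Tanner Helland's widely shared fit; it illustrates the idea rather than reproducing the plug-in's own Kelvin to RGB implementation, so its exact output may differ slightly from the tool's.

    ```python
    import math

    def kelvin_to_rgb(kelvin: float) -> tuple[int, int, int]:
        """Approximate sRGB color of a black body at the given temperature.
        Based on Tanner Helland's curve fit, valid roughly from 1000 K to 40000 K."""
        t = kelvin / 100.0

        # Red channel
        red = 255.0 if t <= 66 else 329.698727446 * ((t - 60) ** -0.1332047592)

        # Green channel
        if t <= 66:
            green = 99.4708025861 * math.log(t) - 161.1195681661
        else:
            green = 288.1221695283 * ((t - 60) ** -0.0755148492)

        # Blue channel
        if t >= 66:
            blue = 255.0
        elif t <= 19:
            blue = 0.0
        else:
            blue = 138.5177312231 * math.log(t - 10) - 305.0447927307

        def clamp(x: float) -> int:
            return int(max(0, min(255, round(x))))

        return clamp(red), clamp(green), clamp(blue)

    # A 4000 K fluorescent tube comes out as a warm white, roughly (255, 206, 166).
    print(kelvin_to_rgb(4000))
    ```

    Note that the result is an sRGB triplet; depending on where you paste it in Unreal, it may need to be converted to linear first.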
    After changing the rectangle light itself, we can use the tool to output the appropriate color.
    Although EV100 is a real measurement, it is not used that much by photographers, as they mostly work with their camera settings instead: Aperture, Shutter Speed, and ISO. The first utility, the Camera Settings Convertor, lets you work with those instead. In our case, we know that the EV100 we aim for is 5, and we want to see which apertures are available while keeping the same exposure. You can see below that to get an exposure of EV100 = 5 with an aperture of f/1.4, you need to set your shutter speed to 15 (i.e. 1/15 s). This is very interesting if you want to control motion blur or depth of field in a realistic way. For this tool to be usable, you need to set the Exposure Metering Mode of your Cine Camera or Post-Process Volume to Manual.
    Once your lights are calibrated, you're free to change the space however you like and to tweak what doesn't work for you. In my case, I reduced the number of lights in the environment to introduce more contrast and played with the fog.
    Conclusion
    Learning the PBL workflow can be a real headache; trust me, I've been there. That's why I started to share my thoughts and discoveries and developed this plug-in. I hope it will help you in your own lighting journey and that it will spark some public discussion about the topic, as the resources out there are too limited. Talking about resources, here are a few recommendations:
    - My previous 80 Level article: The Science of Lighting
    - Talking Physically Based Lighting With Scott Warren, EMC3D
    - Lighting with Eros, Visual Tech Art
    - Ultimate Lighting Course, Tilmann Milde
    The PBL Database is now available on my Fab page! I want this plug-in to be an evolving resource based on your needs. If you want to submit feedback, you can do it on this form. On top of buying a spot meter and improving the database by covering more scenarios, I already have tons of ideas for additional features, like a favorites system, a way to add your own data, or templates for quick PBL lighting setups.
    Before ending this article, I would like to thank a few people whose work I admire and who were kind enough to try out the tool before its release and give me feedback: Elliot McSherry, Corentin Wunsche, Ted Mebratu, and Eros Dadoli.
    If you don't want to miss anything from me, be sure to join my newsletter or add me on LinkedIn! In case you're searching for an Environment or Lighting Artist, I'm also looking for a job in the gaming industry. My work is available on ArtStation and on my own website, where I write blog posts about visual theory.
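    As a quick sanity check of the EV100 arithmetic quoted above, here is a minimal sketch of the standard photographic relation EV100 = log2(N²/t) at ISO 100, plus the Sunny 16 rule. It is not code from the plug-in, just the standard formula relating the quantities the Camera Settings Convertor works with.

    ```python
    import math

    def ev100(aperture_f: float, shutter_seconds: float) -> float:
        """Photographic exposure value at ISO 100: EV100 = log2(N^2 / t)."""
        return math.log2(aperture_f ** 2 / shutter_seconds)

    def shutter_for_ev100(target_ev100: float, aperture_f: float) -> float:
        """Shutter time in seconds that keeps a target EV100 at a given aperture."""
        return aperture_f ** 2 / (2.0 ** target_ev100)

    # The interior example above: EV100 = 5 at f/1.4 needs roughly a 1/15 s shutter.
    print(1.0 / shutter_for_ev100(5, 1.4))  # ~16.3, i.e. about 1/15 s

    # Sunny 16 rule: f/16 with a shutter of 1/ISO (1/100 s at ISO 100)
    # lands around EV100 15, a typical value for a sunlit exterior.
    print(ev100(16, 1.0 / 100))             # ~14.6
    ```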
  • RT AidyBurrows: OUT NOW! Game Asset Workflow: A Complete Blender Guide. What is it? 11+ Hrs! A Step-by-Step Guide to Modeling, Sc...
    x.com
    RT AidyBurrows: OUT NOW! Game Asset Workflow: A Complete Blender Guide. What is it? 11+ Hrs! A Step-by-Step Guide to Modeling, Sculpting, Texturing, UVs, Optimization and Exporting for Game Engines. Link below