A modern open source rendering engine for animation and visual effects
These are the release notes for appleseed 2.1.0-beta.
These notes are part of a larger release; check out the main announcement for details.
This release of appleseed has a DOI: https://zenodo.org/record/3456967
This release is the result of more than ten months of work by the incredibly talented and dedicated appleseed development team.
Many thanks to our code contributors for this release, in alphabetical order:
Many thanks as well to our internal testers, feature specialists and artists, in particular:
Interested in joining the appleseed development team, or want to get in touch with the developers? Join us on Discord. Simply interested in following appleseed's development and staying informed about upcoming appleseed releases? Follow us on Twitter.
appleseed now has native support for Cryptomatte via a pair of new Cryptomatte AOVs. Cryptomatte is a system to generate ID maps that work even in the presence of transparency, depth of field and motion blur.
(IKEA Home Office scene by Chau Tran. Feel free to download the original OpenEXR file to inspect embedded metadata.)
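Under the hood, the Cryptomatte specification hashes object and material names with MurmurHash3 (32-bit) and stores the hash in the bit pattern of a float32 channel. Here is a minimal sketch of that hashing step (the real specification also remaps a few exponent values so the resulting float is never inf or NaN; appleseed's internals are not shown here):

```python
import struct

def murmur3_32(data: bytes, seed: int = 0) -> int:
    """MurmurHash3 x86_32, the hash used by the Cryptomatte spec."""
    c1, c2 = 0xcc9e2d51, 0x1b873593
    h = seed
    n = len(data) & ~3
    for i in range(0, n, 4):
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * c1) & 0xffffffff
        k = ((k << 15) | (k >> 17)) & 0xffffffff
        k = (k * c2) & 0xffffffff
        h ^= k
        h = ((h << 13) | (h >> 19)) & 0xffffffff
        h = (h * 5 + 0xe6546b64) & 0xffffffff
    # Process the 1-3 trailing bytes, if any.
    k = 0
    tail = data[n:]
    if len(tail) >= 3:
        k ^= tail[2] << 16
    if len(tail) >= 2:
        k ^= tail[1] << 8
    if len(tail) >= 1:
        k ^= tail[0]
        k = (k * c1) & 0xffffffff
        k = ((k << 15) | (k >> 17)) & 0xffffffff
        k = (k * c2) & 0xffffffff
        h ^= k
    # Finalization mix.
    h ^= len(data)
    h ^= h >> 16
    h = (h * 0x85ebca6b) & 0xffffffff
    h ^= h >> 13
    h = (h * 0xc2b2ae35) & 0xffffffff
    return h ^ (h >> 16)

def cryptomatte_id(name: str) -> float:
    """Reinterpret the 32-bit hash as the bits of a float32 ID."""
    bits = murmur3_32(name.encode("utf-8"))
    return struct.unpack("<f", struct.pack("<I", bits))[0]
```

Because the ID is a deterministic function of the name, a compositing package can rebuild the same IDs from the manifest stored in the EXR metadata.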
We've added render checkpointing, a mechanism to resume multi-pass renders after they were interrupted (voluntarily or not), and to add rendering passes to a finished render.
At the moment render checkpointing is only exposed in appleseed.cli. Eventually it should become available in appleseed.studio as well.
Here's an example workflow: when starting your multi-pass render (note the `--passes` option), add the `--checkpoint-create` option to create or update the checkpoint file after each render pass:
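For example (the project and output filenames are made up, and the exact flag syntax may differ between versions; check `appleseed.cli --help`):

```shell
appleseed.cli --passes 16 --checkpoint-create \
    scene.appleseed --output scene.exr
```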
After you've interrupted the render with Ctrl+C, or, Heaven forbid, after appleseed has crashed, you can simply resume the render from the last completed render pass by adding the `--checkpoint-resume` option:
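For example (filenames are hypothetical; check `appleseed.cli --help` for the exact syntax):

```shell
appleseed.cli --passes 16 --checkpoint-resume \
    scene.appleseed --output scene.exr
```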
You can also pass both `--checkpoint-create` and `--checkpoint-resume` at the same time to simultaneously resume rendering from a checkpoint and continue updating it as new passes are rendered:
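For example (again with hypothetical filenames):

```shell
appleseed.cli --passes 32 --checkpoint-create --checkpoint-resume \
    scene.appleseed --output scene.exr
```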
Finally, you can pass both options even if no checkpoint exists yet, in which case rendering will simply start from the first pass:
appleseed now has the ability to compile OSL source shaders on the fly. This feature is currently exposed in our Blender plugin. In the following screenshot, the user has written a small OSL shader that remaps its input to a color using a custom color map, then connected it to the V texture coordinate of the object:
Future versions of the 3ds Max and Maya plugins will expose this feature in a similar manner.
We've added a new fisheye lens camera model with support for equisolid angle, equidistance, stereographic and Thoby projections.
Here is a render of the Japanese Classroom scene with the fisheye lens using a 120° horizontal field of view and a stereographic projection:
(Japanese Classroom scene by Blend Swap user NovaZeeke, converted to Mitsuba format by Benedikt Bitterli, then to appleseed format via the `mitsuba2appleseed.py` script that ships with appleseed.)
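The four projections only differ in how the angle θ between a ray and the optical axis maps to a radial distance on the sensor. A sketch of those mappings for focal length `f` (the Thoby formula below is the commonly quoted approximation; appleseed's exact constants may differ):

```python
import math

def fisheye_radius(theta: float, f: float, projection: str) -> float:
    """Radial sensor distance for angle theta (radians) off the optical axis."""
    if projection == "equidistant":
        return f * theta                            # r = f * theta
    if projection == "equisolid":
        return 2.0 * f * math.sin(theta / 2.0)      # r = 2f sin(theta/2)
    if projection == "stereographic":
        return 2.0 * f * math.tan(theta / 2.0)      # r = 2f tan(theta/2)
    if projection == "thoby":
        return 1.47 * f * math.sin(0.713 * theta)   # empirical fit
    raise ValueError(f"unknown projection: {projection}")
```

Stereographic keeps shapes locally circular at the cost of stronger radial stretching, which is why it is a popular choice for artistic fisheye renders.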
We've added a way to control how many samples each pixel receives based on a user-provided black-and-white mask. This makes it possible to remove sampling noise from specific parts of a render without adding samples to areas that are already smooth. This is yet another tool in the toolbox, complementing the new adaptive tile sampler introduced in appleseed 2.0.0-beta and the per-object shading quality control that has been present in appleseed since its early days.
In the following mosaic, the top-left image (1) is the base render using 128 samples for each pixel; the top-right image (2) is a user-painted mask where black corresponds to 128 samples/pixel, white corresponds to 2048 samples/pixel and gray levels correspond to intermediate values; the bottom-left image (3) is the render produced with the new texture-controlled pixel renderer using the mask; the bottom-right image (4) is the Pixel Time AOV where the color of each pixel reflects the relative amount of time spent rendering it (the brighter the pixel, the longer it took to render it):
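Conceptually, the mask drives a simple interpolation between a minimum and a maximum sample count. A sketch (the exact interpolation appleseed applies is not spelled out here):

```python
def samples_for_pixel(mask_value: float,
                      min_spp: int = 128,
                      max_spp: int = 2048) -> int:
    """Map a mask value in [0, 1] to a per-pixel sample count:
    black (0.0) -> min_spp, white (1.0) -> max_spp."""
    m = min(max(mask_value, 0.0), 1.0)
    return round(min_spp + m * (max_spp - min_spp))
```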
We switched appleseed to use Filter Importance Sampling instead of filtered sample splatting. This new technique has three advantages over the previous one: lower noise for a given number of samples per pixel, less waste (tile borders are no longer required), and statistically independent pixels, meaning in practice that modern denoisers should work a lot better when applied to images produced by appleseed.
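With Filter Importance Sampling, instead of splatting each sample into every pixel its filter footprint covers, the sample position is drawn inside the pixel by importance-sampling the reconstruction filter, and every sample then receives equal weight. For a 1D tent (triangle) filter, the inverse-CDF warp looks like this (a sketch, not appleseed's actual code):

```python
import math

def sample_tent(u: float, radius: float) -> float:
    """Warp a uniform sample u in [0, 1) to a tent-filter-distributed
    offset in [-radius, radius] via the inverse CDF."""
    if u < 0.5:
        return radius * (math.sqrt(2.0 * u) - 1.0)
    return radius * (1.0 - math.sqrt(2.0 - 2.0 * u))
```

A 2D filter position is obtained by warping two independent uniform samples, one per axis. Since each pixel only ever accumulates its own samples, pixels are statistically independent, which is exactly the property modern denoisers assume.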
To illustrate this last point, here is an appleseed render of the Modern Hall scene denoised with Intel® Open Image Denoise, a set of open source, high quality, machine learning-based denoising filters:
(Modern Hall scene by Blend Swap user NewSee2l035, converted to Mitsuba format by Benedikt Bitterli, then to appleseed format via the `mitsuba2appleseed.py` script that ships with appleseed.)
- `as_matte()` closure and updated OSL shaders to use it.
- `*.obj` and `*.appleseed` files.
- `idiff` tool.
- `/`.
- `APPLESEED_SEARCHPATH` environment variable even if a project does not define any explicit search path.
- `color` and `alpha` parameters from color entities.
- Fixed the `required parameter "volume_parameterization" not found` error message whenever an OSL shader uses the `as_glass()` closure.
- `site-packages` and appleseed Python module's directories on startup.
- `site-packages` or appleseed Python module's directories cannot be found.
- `PYTHONHOME` environment variable and the likely consequences.
- `APPLESEED_SEARCHPATH` environment variable were saved as explicit search paths in projects.
- Added the `--disable-abort-dialogs` option to disable abort dialogs (Windows only).
- `--to-mplay` and `--to-hrmanpipe` command line options.

Added the following class to Python bindings:
- `ShaderCompiler`

Added the following entry points to Python bindings:

- `AOV.get_cryptomatte_image()`
- `Project.get_post_processing_stage_factory_registrar()`
- `Project.get_volume_factory_registrar()`
- `ShaderGroup.add_source_shader()`
- `ShaderQuery.open_bytecode()`
- `get_lib_compilation_date()`
- `get_lib_compilation_time()`
- `get_lib_configuration()`
- `get_lib_cpu_features()`
- `get_lib_name()`
- `get_lib_version()`
- `get_synthetic_version_string()`
- `get_third_parties_versions()`
- `oiio_make_texture()`

Removed the following entry point from Python bindings (breaking change):

- `Project.reinitialize_factory_registrars()`
- (`as_invert_color`).
- (`as_asc_cdl`).
- `as_subsurface` shader from 2 to 8.
- (`docs/osl/osl-languagespec.pdf`) to version 1.10.
- Removed the `blenderseed/as_closure2surface.osl` shader as it is redundant with `appleseed/as_closure2surface.osl`.
- Made the `Energy Compensation` parameter of the `as_metal` shader non-texturable.
- `as_triplanar` shader.
- The `animatecamera` tool now varies the sampling pattern per frame.
- Added `--motion-blur` to the `animatecamera` tool.
- Added the `--disable-abort-dialogs` option to all command line tools to disable abort dialogs (Windows only).
- `mitsuba2appleseed.py` to be anywhere.

These are the release notes for appleseed 2.0.0-beta.
These notes are part of a larger release; check out the main announcement for details.
This release of appleseed has a DOI: https://zenodo.org/record/3384658
This release is the result of six months of work by the incredibly talented and dedicated appleseed development team.
Many thanks to our code contributors for this release:
Many thanks as well to our internal testers, feature specialists and artists, in particular:
Interested in joining the appleseed development team, or want to get in touch with the developers? Join us on Discord. Simply interested in following appleseed's development and staying informed about upcoming appleseed releases? Follow us on Twitter.
We made a number of improvements to our random-walk subsurface scattering implementation:
- `randomwalk` SSS profile for the `as_subsurface()` closure;
- `as_randomwalk_glass()` closure.

(Geometry, textures and environment map by 3D Scan Store, scene reconstruction and skin shader by Juan Carlos Gutiérrez.)
We did some initial work on non-photorealistic rendering support in appleseed. We added two new OSL closures, `as_npr_shading()` and `as_npr_contour()`, as well as a new OSL shader, `as_toon`.
(Model by Brice Laville, concept by Tom Robinson, render by Esteban Tovagliari - RenderMan "Rolling Teapot" Art Challenge.)
(Original model by Blend Swap user Ricardo28roi, rig by Blend Swap user daren, render by Luis Barrancos.)
Finally, we added two new AOVs: NPR Shading (`npr_shading_aov`) and NPR Contour (`npr_contour_aov`).
This release introduces a new post-processing pipeline that makes it possible to apply treatments to a render without leaving appleseed.
The key (but still experimental) component of this new post-processing pipeline is the Color Map stage: it lets you visualize the Rec. 709 relative luminance of a render through a number of predefined color maps, or through a custom color map defined by an image file.
Five predefined color maps are available: the venerable Jet color map popularized by MATLAB, and four modern, "perceptually uniform sequential" color maps from Matplotlib: Inferno, Magma, Plasma and Viridis.
A color legend can also be included in the render.
A future version of the Color Map stage may allow visualizing other quantities such as photometric luminance (in cd/m²), radiance (in W/sr/m²) or irradiance (in W/m²).
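As a sketch of what the stage computes, here is Rec. 709 relative luminance fed through a small, linearly interpolated color map (the control colors below are made up; appleseed ships real Jet/Inferno/Magma/Plasma/Viridis tables):

```python
def rec709_luminance(r: float, g: float, b: float) -> float:
    """Rec. 709 relative luminance of a linear RGB color."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def apply_color_map(y: float, control_points) -> tuple:
    """Map a relative luminance y in [0, 1] to an RGB color by linear
    interpolation between evenly spaced control colors."""
    y = min(max(y, 0.0), 1.0)
    n = len(control_points) - 1
    t = y * n
    i = min(int(t), n - 1)       # segment index
    f = t - i                    # position within the segment
    a, b = control_points[i], control_points[i + 1]
    return tuple((1.0 - f) * ca + f * cb for ca, cb in zip(a, b))
```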
Beauty render:
(Country Kitchen scene by Blend Swap user Jay-Artist, converted to Mitsuba format by Benedikt Bitterli, then to appleseed format via the `mitsuba2appleseed.py` script that ships with appleseed.)
False colors, Inferno color map:
False colors, Jet color map:
False colors, Magma color map:
False colors, Plasma color map:
False colors, Viridis color map:
In addition, the Color Map stage can render relative luminance isolines, that is, lines of equal relative luminance in the render:
Color mapping and isolines can also be applied right from appleseed.studio without having to add them as post-processing stages and re-rendering the scene:
Post-processed renders can be saved to disk:
Finally, the render stamp feature introduced in appleseed 1.9.0-beta has been converted to a post-processing stage (projects are automatically updated).
During his Google Summer of Code 2018 participation, Kevin Masson implemented a new adaptive tile sampler that provides superior performance over the former adaptive pixel sampler (which is now deprecated). The new adaptive tile sampler is based on a number of recent papers. Please check out Kevin's GSoC 2018 report for details and references to relevant papers.
Here is an equal-time comparison between the uniform tile sampler and the new adaptive one:
(Earth texture from Shaded Relief, scene and renders by Kevin Masson. Top-left: uniform sampling; top-right: adaptive sampling; bottom-left: Pixel Sample Count AOV (see below); bottom-right: difference between uniform and adaptive sampling.)
Close-up:
Here is another equal-time comparison:
(Cookies & Milk scene by Harsh Agrawal, render by Kevin Masson.)
Close-up:
Two AOVs have also been added:
- Pixel Sample Count AOV (`pixel_sample_count_aov`): pixels in blue are those that received the fewest samples while pixels in red are those that received the most.
- Pixel Variation AOV (`pixel_variation_aov`): pixels in blue are those that contain the lowest noise while pixels in red are those that contain the highest.

appleseed now supports Roughness Clamping, a trick popularized by Arnold that reduces fireflies in scenes with lots of glossy and specular surfaces:
Roughness Clamping enabled:
Roughness Clamping disabled:
Close-ups:
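The underlying trick can be sketched as follows: as a path is traced, the roughness at each bounce is clamped to be no lower than the maximum roughness seen at earlier bounces, which tames the sharp glossy-chain paths responsible for fireflies (the exact clamping rule varies between renderers; this is a simplification):

```python
def clamp_path_roughness(path_roughnesses):
    """Roughness clamping sketch: each path vertex's roughness is raised
    to at least the maximum roughness encountered earlier on the path."""
    clamped, running_max = [], 0.0
    for r in path_roughnesses:
        running_max = max(running_max, r)
        clamped.append(running_max)
    return clamped
```

The bias this introduces is usually invisible because it only blurs secondary reflections, while the variance reduction is substantial.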
We added an Albedo AOV (`albedo_aov`) to capture the "base color" of surfaces. AOVs are not yet exposed in appleseed.studio; however, the Albedo AOV is accessible via the new Albedo diagnostic mode that replaced the old Color one:
Here is the result of rendering the Wooden Staircase scene using the Albedo diagnostic mode:
(Wooden Staircase scene by Blend Swap user Wig42, converted to Mitsuba format by Benedikt Bitterli, then to appleseed format via the `mitsuba2appleseed.py` script that ships with appleseed.)
This release includes `as_sbs_pbrmaterial`, a new OSL shader that matches Allegorithmic's Substance Painter shading model in the metallic/roughness workflow.
You can find out all about this new shader in the documentation.
Here are a few renders using this shader:
(Renders by Luis Barrancos.)
We significantly redesigned and cleaned up our shaderball, now in version 5:
(Render by Juan Carlos Gutiérrez.)
The shaderball scene is available in the following formats:
In addition, the shaderball is now also available as a low-poly model (15,140 triangles) in all formats.
We added a search paths editor to appleseed.studio. To open it, right-click on the top-level item (Project) of the Project Explorer and choose Edit Search Paths...:
In the search paths editor window, paths are ordered by ascending priority (paths lower in the list override those from higher up). Dark gray paths are those set via the `APPLESEED_SEARCHPATH` environment variable and cannot be edited, while light gray ones are explicit paths that can be edited:
- `fresnel_weight` parameter to the `as_glossy()` OSL closure (default is 1).
- (`position_aov`).
- The `color` mode has been replaced by the `albedo` mode in the `diagnostic_surface_shader` surface shader.
- If the `PYTHONHOME` environment variable is defined, appleseed.studio will use the Python release (which must be of the Python 2.7 variety) it points to instead of the Python release bundled with appleseed. It was already the case on Windows; it is now also the case on Linux and macOS.
- (`dIdx` and `dIdy`) in OSL.
- `--save-light-paths` command line option.
- Replaced the `--select-object-instances` command line option by `--show-object-instances` and `--hide-object-instances`.
- Added the `--libraries` command line option to print third-party library versions.

Added the following classes to Python bindings:
- `BlenderProgressiveTileCallback`
- `IPostProcessingStageFactory`
- `IVolumeFactory`
- `MurmurHash`
- `PostProcessingStage`
- `PostProcessingStageContainer`
- `PostProcessingStageFactoryRegistrar`
- `ProjectPoints`
- `ShaderCompiler`
- `ShaderGroupContainer`
- `Vector2u`
- `Vector3u`
- `Vector4u`
- `Volume`
- `VolumeContainer`
- `VolumeFactoryRegistrar`
Added the following methods to Python bindings:
- `AOV.get_channel_count()`
- `AOV.get_channel_names()`
- `AOV.get_image()`
- `AOV.has_color_data()`
- `Assembly.bssrdfs()`
- `Assembly.clear()`
- `Assembly.volumes()`
- `AssemblyInstance.set_transform_sequence()`
- `BaseGroup.clear()`
- `Camera.set_transform_sequence()`
- `EnvironmentEDF.set_transform_sequence()`
- `Frame.get_input_metadata()`
- `Frame.post_processing_stages()`
- `MeshObject.reserve_vertex_tangents()`
- `MeshObject.push_vertex_tangent()`
- `MeshObject.get_vertex_tangent_count()`
- `MeshObject.get_vertex_tangent()`
- `ShaderQueryWrapper.open_bytecode()`
- `Tile.get_storage()`
- `TransformSequence.optimize()`
- `APPLESEED_SEARCHPATH` environment variable when setting search paths from Python.

Removed the following methods from Python bindings (breaking changes):

- `Frame.aov_images()`
- `Tile.blender_tile_data()`
- `Tile.copy_data_to()`
Added shaders:
- `as_closure2surface`
- `as_manifold2D`
- `as_sbs_pbrmaterial`
- `as_texture2surface`

Added Blender metadata to all appleseed shaders.

Tweaked and improved help strings.

- `as_texture` shader.

Removed shaders (breaking changes):

- `as_maya_closure2Surface`
- `as_maya_texture2Surface`

Removed all Gaffer-specific shaders.

Added a `THIRDPARTIES.txt` file with the full license text of all third party libraries used in appleseed.

Added `aspaths2json.py`, a Python script that can convert Light Paths files (`*.aspaths`) to JSON.

With some Linux distributions (Fedora 29, Arch Linux), appleseed.studio may fail to start with the following error:
```
./bin/appleseed.studio: symbol lookup error: /usr/lib/libfontconfig.so.1: undefined symbol: FT_Done_MM_Var
```
The reason appears to be newer `libfontconfig` and `libz` libraries in the system compared to those that ship with appleseed.
One solution is to force using the system's libraries:

```
env LD_PRELOAD="/usr/lib64/libfreetype.so /usr/lib64/libz.so" ./appleseed.studio
```
Another solution is simply to delete `libfreetype.so.6` and `libz.so.1` from appleseed's `lib/` directory.
These are the release notes for appleseed 1.9.0-beta.
These notes are part of a larger release; check out the main announcement for details.
This release is the result of four intense months of work by the incredibly talented and dedicated appleseed development team.
Many thanks to our code contributors for this release:
And to our internal testers, feature specialists and artists, in particular:
Interested in joining the appleseed development team, or want to get in touch with the developers? Join us on Discord. Simply interested in following appleseed's development and staying informed about upcoming appleseed releases? Follow us on Twitter.
Artem Bishev was one of our Google Summer of Code participants last year. He took up the rather intimidating project of implementing volumetric rendering in appleseed, and did an admirable job. Make sure to check out his final report for a detailed account of the features he implemented during the summer and the results he obtained.
Artem did not stop contributing once the summer was over. Instead, he kept pushing forward and implemented the much anticipated "Random Walk Subsurface Scattering", a more accurate method of computing subsurface scattering that actually simulates the travel of light under the surface of objects.
The results are nothing short of amazing, especially since Random Walk SSS appears to be as fast as, and sometimes faster than, diffusion-based SSS (the traditional, less accurate method). Here are a couple of renders showing Random Walk SSS in action:
(Stanford Dragon, render by Artem Bishev.)
(Statue of Lu-Yu by Artec 3D, render by Giuseppe Lucido.)
Esteban integrated the BCD denoiser in appleseed. BCD stands for Bayesian Collaborative Denoiser: it's a denoising algorithm designed to remove the last bits of noise from high-quality renders made with many samples per pixel (unlike other denoising techniques that basically reconstruct an image from a very small number of samples per pixel).
Here is the denoiser in action (click for the original uncompressed image):
Light paths can now be efficiently recorded during rendering, without significantly impacting render times. This is part of a larger project where appleseed is being used in an industrial context to analyze how light scatters on or inside machines and instruments.
When light paths recording is enabled in the rendering settings, it becomes possible (once rendering is stopped) to explore and visualize all light paths that contributed to any given pixel of the image. Light paths can be recorded for the whole image or for a given region when a render window is defined. Light paths can then be visualized interactively via an OpenGL render of the scene. Light paths can also be exported to disk in an open and efficient binary file format.
Here is a screenshot of appleseed.studio showing one of the many light paths that contributed to a pixel that was chosen in the render of a 3D model of the Hubble Space Telescope made freely available by NASA:
Here is a short video showing the process of enabling light paths capture and visualizing light paths in appleseed.studio:
Jino Park and Fedor Matantsev implemented support for non-instantaneous opening and closing of camera shutters: it is now possible to specify how long a camera's shutter takes to open or close, as well as the rate at which it opens and closes.
Here is an illustration showing three cases:
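A common way to model this is a trapezoidal shutter curve: the weight ramps from 0 to 1 while the shutter opens, holds at 1 while it is fully open, and ramps back down to 0 while it closes. A sketch (appleseed's actual parameterization of the open/close rates may differ):

```python
def shutter_weight(t: float,
                   open_begin: float, open_end: float,
                   close_begin: float, close_end: float) -> float:
    """Trapezoidal shutter curve: linear ramp up while opening,
    1.0 while fully open, linear ramp down while closing."""
    if t <= open_begin or t >= close_end:
        return 0.0
    if t < open_end:
        return (t - open_begin) / (open_end - open_begin)
    if t <= close_begin:
        return 1.0
    return (close_end - t) / (close_end - close_begin)
```

Sampling time proportionally to this curve weights motion-blurred samples toward the fully open interval, which is what gives fast-moving objects their characteristic soft leading and trailing edges.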
It is now possible to have appleseed add a "render stamp" to renders. The render stamp is fully configurable and can contain a mixture of arbitrary text and predefined variables such as appleseed version, render time, etc. Here is what the default render stamp looks like:
Esteban also added a new Pixel Time AOV (or "render layer") that holds per-pixel render times (technically, the median time it took to render one sample inside each pixel). This new AOV can help figure out which areas of the image are most expensive to render, and thus help optimize renders.
Here is an example of the pixel time AOV (with exposure adjusted) that shows that grooves and other concave areas of the shaderball are more expensive to render than convex ones (the scattered white dots are likely due to the operating system interrupting the rendering process once in a while):
- `printf()` function.
- `as_glossy()` closure.
- `--to-stdout`.
- `projecttool` can now remove unused entities and print dependencies between entities.
- `as_blackbody`, `as_blend_color`, `as_bump`, `as_composite_color`, `as_fresnel`, `as_subsurface`, `as_switch_surface`, `as_switch_texture`, `as_texture`, `as_texture_info`, `as_texture3d`, `as_triplanar`.
- Renamed `as_layer_shader` to `as_blend_shader`.
- `as_disney_material`.
- `Pref`, `Nref` and UV coordinates to `as_globals`.
- `as_standard_surface`.
- `as_standard_surface`.
- `as_standard_surface`.

These are the release notes for appleseed 1.8.1-beta.
appleseed 1.8.1-beta is a minor release fixing issues discovered in appleseed 1.8.0-beta.
- `as_plastic()` closure.
- `dumpmetadata` command line tool.
- (`as_osl_extensions.h`) file.

These are the release notes for appleseed 1.8.0-beta.
These notes are part of a larger release; check out the main announcement for details.
This release is the fruit of the formidable work of the appleseed development team over the last six months.
Many thanks to our code contributors:
And to our internal testers, feature specialists and artists, in particular:
Interested in joining the appleseed development team, or want to get in touch with the developers? Join us on Slack. Simply interested in following appleseed's development and staying informed about upcoming appleseed releases? Follow us on Twitter.
We introduced support for procedurally-defined objects, that is, objects whose surface is defined by a function implemented in C++. This opens up a vast array of possibilities, from rendering fractal surfaces such as the Mandelbulb shown below, to rendering raw CAD models based on Constructive Solid Geometry without tessellating them into triangle meshes. One could even use this mechanism to prototype new acceleration structures with appleseed.
appleseed ships with three sample procedural objects in `samples/cpp/`: an infinite plane object, a sphere object and the Mandelbulb fractal. This last sample is particularly interesting: it's actually a generic Signed Distance Field raymarcher and a great toy to tinker with if you like procedural graphics!
Some good resources about Signed Distance Fields:
We also introduced long-awaited support for procedural assemblies. In appleseed's terminology, an assembly is a "package" that represents a part of the scene. A procedural assembly is one that gets populated at render time instead of being described in the scene file. Somewhat schematically, a procedural assembly is defined by a C++ plugin which is invoked by the renderer when it needs to know the content of the assembly.
Procedural assemblies make it possible to populate a scene procedurally, for instance to generate thousands of instances of a single object based on rules, without having to store all those instances on disk.
Below is an example of a procedural assembly: it is a modern recreation of the sphereflake object designed by Eric Haines et al. and described in the 1987 article A Proposal for Standard Graphics Environments, rendered using subsurface scattering (using the Normalized Diffusion BSSRDF), a Disney BRDF and image-based lighting:
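A procedural assembly of this kind essentially runs a small generator at render time. As an illustration, here is a simplified sphereflake generator producing (center, radius) pairs, where every sphere spawns nine touching children at one third of its radius (the placement below is a simplification of Haines' original layout):

```python
import math

def sphereflake(depth, center=(0.0, 0.0, 0.0), radius=1.0):
    """Generate (center, radius) pairs for a simplified sphereflake."""
    spheres = [(center, radius)]
    if depth == 0:
        return spheres
    child_radius = radius / 3.0
    # 6 children around the equator, 3 on the upper hemisphere
    # (all direction vectors are unit length).
    directions = [
        (math.cos(i * math.pi / 3.0), 0.0, math.sin(i * math.pi / 3.0))
        for i in range(6)
    ] + [
        (0.5 * math.cos(i * 2.0 * math.pi / 3.0),
         math.sqrt(3.0) / 2.0,
         0.5 * math.sin(i * 2.0 * math.pi / 3.0))
        for i in range(3)
    ]
    for d in directions:
        # Place each child so it touches the parent's surface.
        child_center = tuple(
            c + d_i * (radius + child_radius) for c, d_i in zip(center, d))
        spheres.extend(sphereflake(depth - 1, child_center, child_radius))
    return spheres
```

At depth d the flake contains (9^(d+1) - 1) / 8 spheres, so even a modest recursion depth produces a scene far cheaper to generate on the fly than to store on disk.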
For reference (and for fun; who doesn't love retro CG!) here is one early rendering of the Sphereflake by Eric Haines:
(Source: https://www.flickr.com/photos/101332032@N06/10865071225)
A particular type of procedural assembly supported by appleseed is an archive assembly. An archive assembly is defined by a bounding box and a reference to another appleseed scene file (extension `.appleseed` or `.appleseedz`). The referenced scene file is loaded by the renderer at the appropriate time. This is a common but powerful mechanism to assemble large and complex scenes out of smaller parts.
One important use of procedural assemblies in production is reading parts of scenes from Alembic archives, Pixar's USD scenes and other custom file formats. We recently started working on adding support for loading geometry directly from Alembic archives.
This release saw the previous AOV mechanism completely ripped out and reimplemented from scratch. While flexible, the previous AOV mechanism made it hard or in some cases impossible to composite AOVs back together. Moreover it lacked fundamental features such as splitting direct lighting from indirect lighting contributions or splitting diffuse from glossy scattering modes.
The new AOV subsystem implements what you expect from a modern renderer. appleseed currently supports the following AOVs:
Here is an example of AOVs rendered from our shaderball scene:
And the beauty render:
AOVs must currently be declared manually in the scene file:
```xml
<output>
    <frame name="beauty">
        <parameter name="camera" value="/group/camera" />
        <parameter name="resolution" value="640 480" />
        <parameter name="crop_window" value="0 0 639 479" />
        <aovs>
            <aov model="diffuse_aov" />
            <aov model="glossy_aov" />
        </aovs>
    </frame>
</output>
```
The next release of appleseed will allow adding/removing AOVs directly from appleseed.studio. It will also add AOVs for our render denoiser, and if time allows, it will introduce native Cryptomatte support. We are also considering adding a motion vectors AOV in a future release.
The color pipeline is another area that received a ton of attention in this release.
First of all, remember that appleseed is fully capable of spectral rendering, currently using 31 equidistant wavelengths in the 400 nm to 700 nm visible light range.
In the name of speed and accuracy, we no longer switch on-the-fly between RGB and spectral representations during rendering. We had a neat mechanism that supported this feature for many years but it had three main problems: it prevented us from being completely rigorous about color management, it incurred tremendous complexity in the very core of the renderer, and it had nasty edge cases causing severe performance issues. Instead, we now let the user decide which color pipeline to use, RGB or spectral:
The RGB pipeline is the default and for most purposes there is no point in changing it. However appleseed is one of the rare renderers with spectral capabilities and we will continue pushing and extending this mechanism over the releases to come. In particular the next release will extend the wavelength range to 380-780 nm and should feature a much improved RGB-to-spectral conversion algorithm.
We also removed a lot of settings from Frame entities that didn't make much sense anymore. appleseed's internal framebuffer is now always linear, unclamped, premultiplied and uses 32-bit floating point precision.
Finally, appleseed.studio is now using OpenColorIO for color management. There is now a dropdown menu in appleseed.studio to choose a display transform (an Output Device Transform in ACES terminology) at any point during or after rendering:
appleseed.studio will populate this dropdown menu with transforms listed in OpenColorIO's configuration on the user's machine.
In addition to the procedural objects and procedural assemblies described above, this release also makes it possible to extend all "scene modeling components" of appleseed via external plugins written in C++. This includes cameras, BSDFs, BSSRDFs, EDFs, lights, materials, textures, environments... There are technicalities that make this work incomplete in this release; we will gradually fix them as the need arises.
We now include Substance Painter export presets in the `share/` folder inside appleseed's package.
Maps exported using these presets can be used in appleseed-maya and Gaffer with appleseed's new standard material:
Our Google Summer of Code 2017 student Gleb Mishchenko implemented a Python console inside appleseed.studio, our scene tweaking, rendering, inspecting and debugging application:
Here is a closeup of the Python console loaded with a simple script that prints the name of all objects in the scene:
appleseed.studio also supports Python plugins: any directory with an `__init__.py` file inside `studio/plugins/` will be imported and registered at startup. There are example plugins in `samples/python/studio/plugins/`.
Our motivation behind the Python console is threefold: simplify and speed up the development of appleseed.studio; let users extend appleseed.studio using Python plugins; and let users automate tedious tasks using one-off Python scripts directly inside appleseed.studio, with immediate feedback on their scene.
This work is still very much in progress: our Python API is quite low-level and not as comfortable to use as it could be, and we have yet to bundle PySide to let scripts extend the user interface and add menus, panels and dialogs (however we did play with this and it already works on Linux).
Details of Gleb's GSoC project can be found in his report.
Artem Bishev, another of our GSoC 2017 students, took upon himself the rather intricate task of implementing physically-based volumetric rendering in appleseed.
In this release, appleseed supports fully raytraced single-scattering and multiple-scattering inside homogeneous participating media. Two phase functions are currently supported: a fully isotropic phase function and the Henyey-Greenstein phase function.
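The Henyey-Greenstein phase function is controlled by a single anisotropy parameter g in (-1, 1) and has a closed-form inverse CDF, which makes it cheap to sample. A sketch (standard formulas, not appleseed's actual code):

```python
import math

def hg_pdf(cos_theta: float, g: float) -> float:
    """Henyey-Greenstein phase function; integrates to 1 over the sphere."""
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * math.pi * denom ** 1.5)

def hg_sample_cos_theta(u: float, g: float) -> float:
    """Draw cos(theta) by inverting the HG CDF; u is uniform in [0, 1]."""
    if abs(g) < 1e-6:
        return 1.0 - 2.0 * u  # isotropic limit
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - s * s) / (2.0 * g)
```

Positive g scatters light preferentially forward (cos θ near 1), negative g backward, and g = 0 recovers the fully isotropic phase function.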
A volumetric bunny in the Cornell Box (spectral render):
This work is very much in progress. Main upcoming features are heterogeneous media and OpenVDB support.
Details of Artem's project can be found in his report.
Petra Gospodnetić, our third GSoC student this year, spent her summer implementing a new light sampling strategy in appleseed based on Nathan Vegdahl's Light Tree algorithm. The Light Tree is a simple yet clever unbiased sampling algorithm that picks better candidate points on area/mesh light sources.
To illustrate the benefits of the Light Tree, here is a comparison between the existing light sampling algorithm and the new one on a scene making heavy use of mesh lights (both images were rendered with 16 samples/pixel):
And here is the converged render:
Full details of Petra's work can be found in her report.
- `--to-stdout` mode to send rendered tiles to the standard output (in binary).
- Removed the `--continuous-saving` mode since it fundamentally could not work reliably.
- `settings/appleseed.cli.xml`.

This minor release fixes appleseed.python (appleseed's Python bindings) on macOS.
These are the release notes for appleseed 1.7.0-beta.
The following people contributed code for this release, in alphabetical order:
We made deep improvements to our ray traced subsurface scattering implementation. The new code gives more accurate and consistent results. We also took this opportunity to add support for SSS sets:
(On the left, each object is in its own SSS set; on the right, all objects are in the same SSS set.)
We also modified the Gaussian BSSRDF to expose the same parameters as other BSSRDF models. This makes finding the right BSSRDF model much easier.
Finally, we added Fresnel weight parameters to all BSSRDF models and we exposed the Gaussian BSSRDF to OSL.
We added a few new controls to appleseed's unidirectional path tracer:
Low Light Threshold allows the path tracer to skip low-contribution samples (thus saving the cost of expensive shadow rays) without introducing bias. By default the threshold is set to 0 (no samples are skipped), but this default might change in the future.
Separate diffuse, glossy and specular bounce limits, in addition to the existing global bounce limit.
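The unbiasedness of the low light threshold comes from a Russian-roulette-style trick: a sample whose estimated contribution falls below the threshold is only traced with a probability proportional to its contribution, and weighted up by the inverse probability when it is. A sketch of that idea (my reading of the technique, not appleseed's actual code):

```python
def maybe_shadow_ray(contribution: float, threshold: float, u: float):
    """Decide whether to trace a shadow ray for a light sample.
    Returns (trace, weight); u is a uniform random number in [0, 1)."""
    if threshold <= 0.0 or contribution >= threshold:
        return True, 1.0
    p = contribution / threshold   # survival probability
    if u < p:
        return True, 1.0 / p       # compensate for skipped samples
    return False, 0.0
```

In expectation the weighted contribution equals the original one, so the estimator stays unbiased while most cheap-to-skip shadow rays are never traced.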
appleseed now features a physically-based plastic BRDF:
(Coffee Maker scene by Blend Swap user cekuhnen, lighting and material setup by Benedikt Bitterli, converted to appleseed using our new mitsuba2appleseed.py tool.)
We added OSL shaders that expose some of appleseed's built-in BSDFs such as the Disney BRDF and our physically-based Glass BSDF, as well as Voronoi texture nodes (2D & 3D). We're continually growing and expanding this shader library.
In addition, we now have high-quality OSL implementations of many Maya shading nodes. They are used in appleseed-maya, our soon-to-be-released integration plugin for Autodesk Maya, to translate Maya shading networks to appleseed.
- (`*.appleseedz`).
- `<parameter name="flip_normals" value="true" />`.
- `NDC` and `raster` OSL coordinate systems.
- `*.appleseedz` file.
- Replaced `updateprojectfile` by `projecttool`, a new command line tool with more features.
- Added `mitsuba2appleseed.py` to convert Mitsuba scenes to appleseed.

This is a preview version of appleseed 1.7.0-beta.
These are the release notes for appleseed 1.6.0-beta.
The following people contributed code for this release, in alphabetical order:
Note: We are providing builds relying on the SSE 4.2 SIMD instruction set. Please contact us if your machine does not support SSE 4.2 and you would like a build relying on SSE 2 instead.
- `matrix[]` and `int[]` parameters support.
- `background()` closure.
- (`g`) parameter from appleseed subsurface closures.
- `Object` and `MeshObject` methods to Python bindings.

appleseed 1.5.1-beta is a new build of appleseed 1.5.0-beta that restores Windows 7 compatibility.
There are no other changes.