Cultura e Scienza del Colore - Color Culture and Science



Walter Arrighetti, PhD ¹
walter.arrighetti@gmail.com

¹ CTO  |  Frame by Frame Italia

Motion Picture Colour Science and 
film ‘Look’: the maths behind ACES 
1.0 and colour grading  

1. INTRODUCTION

One of the biggest colour-related problems for the film production and post-production industry is two-fold: to ensure that the creative "look" of video content, as envisioned by the cinematographer, is preserved throughout ([1]–[2]), and to be able to reproduce it consistently ([1], [3]). This must also be independent both of the digital cameras or computers generating and animating the content (as input), and of the finished-asset specifications through which end-users watch and enjoy it (as output): be it in a dark digital-cinema theatre, in a home TV setting, or through a Video-on-Demand (VoD) or Internet-streaming application, in a day-lit room or even in open sunlight.
In recent years many proprietary/commercial tools and workflows have emerged, each driven by specific, not always cross-compatible needs (e.g. on-set grading, Digital Cinema mastering, VoD, etc.). The result is a proliferation of different formats against a scarce number of truly interoperable standards.
The author has put considerable effort into providing a unified mathematical formalism and usability for most of the colour-management technologies used in the post-production world, both in independent publications and in collaboration with several entities in the business (such as SMPTE and AMPAS). After a minimal introduction to such colour-mathematical terminology ([4]–[6]) and to ColorLUTs, two relatively new colour-management techniques from high-profile moving-picture digital imaging (CDLs and ACES) will be described, as they aim at colour interoperability for the analysis and synthesis of digital 'looks', both on-set (production) and along the Digital Intermediate (DI) phase.
ACES in particular, to which the author has been an active contributor since 2012, is an Academy-originated initiative for facilitating colour interoperability across the Media & Entertainment industry.

2. COLOUR SCIENCE  
    MATHEMATICAL FORMALISM

A gamut mapping between colour spaces ([5]–[7]) is a vector field L(c), where c ∈ G is the input colour in the source gamut G ⊆ ℝᵐ (which is, for all practical purposes, a connected, path-connected and simply-connected m-dimensional domain, often even a convex one), with dim L(G) = n. Let the input and output spaces both follow the RGB model (m = n = 3) and let their canonical bases be the left-handed triple {r̂, ĝ, b̂}, so that any input colour c ∈ ℝ³ has coordinates c = r r̂ + g ĝ + b b̂ and, for the regular input RGB cube, (r, g, b) ∈ [0,1]³. The output colour is thus L(c) = R r̂ + G ĝ + B b̂; by the Hodge–Helmholtz theorem, the orthogonal decomposition holds ([4]–[5]):

L(c) = R(r,g,b) r̂ + G(r,g,b) ĝ + B(r,g,b) b̂ = T(c) + H(c) = ∇χ(c) + ∇×η(c) + l

where T(c) and H(c) are the conservative 
(curl-free) and the solenoidal (divergence-free) 
parts of the colour map, each derived from 
a potential field — a scalar one χ(c) for the 
former, and a vector one η(c) for the latter. 
Due to simple connectedness of G, no harmonic 
component is present in the above: the constant 
‘lift’ term l represents an overall colour bias 
(either neglected or incorporated into T) for 
chromatically-additive colour models like sRGB, as well as CIE XYZ and DCI X'Y'Z'.
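As a minimal worked illustration (not drawn from the cited references), consider the simplest case of an affine colour transform L(c) = M c + l with a constant 3×3 matrix M: one valid split takes T(c) = sym(M)·c, the gradient of the scalar potential χ(c) = ½ cᵀ·sym(M)·c, and H(c) = skew(M)·c, which is divergence-free. The numpy sketch below, with made-up matrix values, performs that split numerically:

import numpy as np

# Made-up affine colour transform L(c) = M @ c + lift (illustrative values only)
M = np.array([[ 1.10, -0.05,  0.02],
              [ 0.03,  0.95,  0.04],
              [-0.02,  0.06,  1.01]])
lift = np.array([0.01, 0.00, -0.01])

S = 0.5 * (M + M.T)   # symmetric part: T(c) = S @ c, the gradient of chi(c) = 0.5 * c @ S @ c
A = 0.5 * (M - M.T)   # skew part: H(c) = A @ c, divergence-free since trace(A) == 0

c = np.array([0.4, 0.5, 0.3])                            # an input RGB colour
assert np.allclose(M @ c + lift, S @ c + A @ c + lift)   # T(c) + H(c) + l recovers L(c)
print("div H =", np.trace(A))                            # 0: the solenoidal part carries no overall gain

For a general non-linear mapping the same split applies pointwise to its Jacobian field rather than to a single constant matrix.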
The gradient of the gamut mapping can also 
be considered, which is a more complete (and 
complex) mathematical object, called an (n,m)-tensor field, [4], depending on both the source
m channels and the target n channels. T is 
the generalized tonal mapping, or transfer 
characteristics which, in RGB spaces, models 
overall colour correction (incl. lightness and 
saturation changes). H is the field describing 
local colour-component cross-talks and global 
hue shifts. Notably, T field too may incorporate 
hue shifts, especially for those colours c, where 
the inter-channel ratios are not preserved, i.e. 
R(c) : G(c) : B(c) ≠ r : g : b.
Colour-correction languages often use luma-chroma colour models (e.g. the L*a*b*, Y'UV, Y'CBCR and Y'CX'CZ' spaces), or cylindrical colour models (e.g. the HSL space). In the case of HSL, for example, the author usually suggests joining hue h and saturation σ together into a single complex parameter called chroma ς, defined as [5]:

ς = σ e^(ih)

(where the hue angle is obtained via arctan2, the two-argument arc-tangent reminiscent of the quadrant allocation for b* and a* and commonly found in all programming languages).



“run” the algorithm, than applying more complex 
mathematical formulæ.
Third, a cluT can hardly be interpreted only 
by specific software able to read its encoding 
and is useful only on specific picture(s) it was 
intended for: quite a black-box ingredient for the 
motion-picture recipes. The latter aspect may 
have been advantageous in the past, but is now 
mostly a downside, when cinematographers, 
colourists and VFX artists really need to transfer 
colour corrections from the on-set pre-grading 
sessions throughout the whole post-production 
pipeline, up to the theatre room, and capable of 
doing so in the most advantageous and, above 
all, interoperable way (cfr. §5). Moreover, lots of 
workflows with so different and “undisciplined” 
uses of cluTS exist —be it either for technical 
and creative intents— such that no generalized 
use can be made of a cluT as long it is tailored 
for a specific project. It is hard to invert (i.e. to 
“reverse-engineer”) the mathematical operations 
baked into a cluT, especially for post/VFX 
labs which do not enforce a thorough colour 
management across their pipelines. That is even 
worsened when materials from different sources 
(camera makes, film emulsions, CGI rendering, 
…) come all together in place.
Discrete-calculus tools allow the extraction of 
quantities essential for the analysis or synthesis 
of a colour transformations: estimating colour 
differences, hue shifts in degrees, boundary 
wedges for evaluating Out-of-Gamut (OoG) 
colours, etc..
When technical problems of higher level arise 
in colour correction (e.g. colour characterization 
of specific input or output devices, or proper 
gamut mappings between footage with 
different colorimetry), this usually translates 
into more sophisticated mathematical tools to 
be employed, often derived from Differential 
Geometry, Harmonic Analysis and multi-
dimensional interpolations, [4]. 
For example, a more careful shaping of a 
tone-scale curve is usually necessary when 
modelling the transfer characteristic of a non-
digital device, e.g. sensor noise or the sought-
after 35mm film print emulation (FPE, cfr. Fig.2): 
three control points as provided by a CDL ([11]) 
or a 3-way color-corrector (CC) are no more 
enough and the three channel functions R(r), 
G(g) and B(b) need to stay non-decreasing (i.e. 
invertible). This helps better trim the effective 
contrast on all the tonal ranges. When the three 
functions are uneven with each other, a hue shift 
inevitably occurs, as the hue is not preserved by 
the same input triple (r,g,b) any more. Imposing 
hue-invariance means adding constrains that 
need to be correctly formulated), i.e.:

ˆˆ ˆ( ) ( )
B G R B G R
g b b r r g

   ∂ ∂ ∂ ∂ ∂ ∂ 
∇ × = ∇ × = − + − + − =    ∂ ∂ ∂ ∂ ∂ ∂    

L c H c r g b 0

for b* and a* and commonly found in all the 
programming languages).
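As an illustration of how the non-sampled colours are obtained from the grid, the sketch below (numpy assumed; input normalised to [0,1], with N = 17 samples per channel as in Fig. 1) performs a plain trilinear look-up into a 3D-LUT stored as an N×N×N×3 array. Production implementations often prefer tetrahedral interpolation, but the principle is the same:

import numpy as np

def apply_3dlut(lut, rgb):
    """Trilinear interpolation of one RGB triple through an N x N x N x 3 LUT."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0                                   # fractional position inside the grid cell
    out = np.zeros(3)
    for dr in (0, 1):                              # accumulate the 8 surrounding grid points
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

# identity 17-point LUT: output equals input up to interpolation error
N = 17
grid = np.linspace(0.0, 1.0, N)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3dlut(identity_lut, (0.30, 0.61, 0.18)))   # ~ [0.30, 0.61, 0.18]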

The reasons why the CLUT implementation is so widely used are manifold. First of all, provided an appropriate density of N input values per channel is used (the 17×17×17 sampling of Fig. 2 is a common, yet still rather coarse-grained, choice), it can represent any non-linearity in the colour transform, accounting for everything from the most complex primary colour corrections up to a 35mm film's dye cross-talk, as is the case for 3D-LUTs. Secondly, it is implemented via simple (and usually linear) interpolations on the non-sampled colours, scaling fairly with the CLUT size N, and it therefore has a smaller footprint in terms of CPU power and memory needed to run the algorithm than the application of more complex mathematical formulæ. Third, a CLUT can be interpreted only by specific software able to read its encoding and is useful only on the specific picture(s) it was intended for: quite a black-box ingredient in motion-picture recipes. The latter aspect may have been advantageous in the past, but it is now mostly a downside, when cinematographers, colourists and VFX artists really need to transfer colour corrections from the on-set pre-grading sessions throughout the whole post-production pipeline, up to the theatre room, and to do so in the most advantageous and, above all, interoperable way (cf. §5). Moreover, so many workflows with different and "undisciplined" uses of CLUTs exist, be it for technical or creative intents, that no generalized use can be made of a CLUT as long as it is tailored to a specific project. It is hard to invert (i.e. to "reverse-engineer") the mathematical operations baked into a CLUT, especially for post/VFX labs which do not enforce thorough colour management across their pipelines. This is made even worse when materials from different sources (camera makes, film emulsions, CGI rendering, …) all come together in one place.
Discrete-calculus tools allow the extraction of quantities essential for the analysis or synthesis of a colour transformation: estimating colour differences, hue shifts in degrees, boundary wedges for evaluating Out-of-Gamut (OoG) colours, etc.
When higher-level technical problems arise in colour correction (e.g. colour characterization of specific input or output devices, or proper gamut mappings between footage with different colorimetry), this usually translates into more sophisticated mathematical tools, often derived from Differential Geometry, Harmonic Analysis and multi-dimensional interpolation, [4].
For example, a more careful shaping of a tone-scale curve is usually necessary when modelling the transfer characteristic of a non-digital device, e.g. sensor noise or the sought-after 35mm print-film emulation (PFE, cf. Fig. 2): the three control points provided by a CDL ([11]) or by a 3-way colour-corrector (CC) are no longer enough, and the three channel functions R(r), G(g) and B(b) need to stay non-decreasing (i.e. invertible). This helps better trim the effective contrast over all tonal ranges. When the three functions differ from one another, a hue shift inevitably occurs, as the hue is no longer preserved for the same input triple (r,g,b). Imposing hue-invariance means adding constraints that need to be correctly formulated, i.e.:

∇×L(c) = ∇×H(c) = (∂B/∂g − ∂G/∂b) r̂ + (∂R/∂b − ∂B/∂r) ĝ + (∂G/∂r − ∂R/∂g) b̂ = 0
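This condition can be checked numerically on any sampled transform (e.g. a CLUT). A minimal sketch, assuming numpy and a transform given as a Python callable, estimates ∇×L at a probe colour by central finite differences; a non-negligible magnitude flags a solenoidal (channel cross-talk, hue-mixing) component:

import numpy as np

def curl_at(transform, c, eps=1e-4):
    """Central-difference estimate of (curl L)(c) for a colour transform R^3 -> R^3."""
    J = np.zeros((3, 3))                      # Jacobian dL_i / dc_j
    for j in range(3):
        d = np.zeros(3)
        d[j] = eps
        J[:, j] = (transform(c + d) - transform(c - d)) / (2 * eps)
    return np.array([J[2, 1] - J[1, 2],       # dB/dg - dG/db
                     J[0, 2] - J[2, 0],       # dR/db - dB/dr
                     J[1, 0] - J[0, 1]])      # dG/dr - dR/dg

channel_gains = lambda c: np.array([1.2, 1.0, 0.9]) * c              # per-channel gains only
crosstalk     = lambda c: np.array([c[0] + 0.2 * c[1], c[1], c[2]])  # red picks up some green

probe = np.array([0.4, 0.5, 0.3])
print(curl_at(channel_gains, probe))   # ~ [0, 0, 0]: purely conservative, no cross-talk
print(curl_at(crosstalk, probe))       # non-zero: a solenoidal, hue-mixing component is present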




Figure 1 – Plots of the output gamut L(Γ) of 3D-LUTs L whose input is the 17³-point RGB cube Γ: a. identity mapping; b. colour-space conversion between HDTV's "Rec.709" and Cineon Printing Density (CPD) "logarithmic" RGB spaces; c. from "Rec.709" (gamma γ=2.6) to Digital Cinema (DCI) CIE XYZ colour space; d. from CIE XYZ to DCI's P3 RGB colour space (γ=2.2 – notice the clipping at the cubic gamut boundary of P3); e. from Cineon Printing Density log. RGB to CIE XYZ colour space; f. scene-specific creative Colour Grading LMT including 35mm print-film emulation.

Figure 2 – Two different views of the output gamut L(Γ) of a Print-Film Emulation (PFE) CLUT L engineered by the author (Technicolor laboratories, Rome, 2009), showing the synthesis work done adding additional points to the gamut of a Kodak Vision film in order to expand its latitude prior to 35mm scanning.




More often, though, simpler definitions of hue and saturation are used, like

sat(c) = ½ √(r² + g² + b² − gb − rb − rg),    hue(c) = arctan[ √3 (g − b) / (2r − g − b) ]

which allow for simpler analytical properties, like

∇sat(c) = (2r − g − b, 2g − b − r, 2b − r − g) / (8 sat(c)),    ‖∇hue(c)‖ = √(3/8) / sat(c) = ‖∇sat(c)‖ / sat(c).

Imposing hue-invariance means, in this case, solving the algebraic equation

hue(L(c)) = hue(c)  ⇔  (G − B)/(2R − G − B) = (g − b)/(2r − g − b).

Another important constraint that is sometimes necessary is the existence-of-inverse condition. This is especially important to guarantee that, once a grade is 'burned' into the raster pixels, the original colours can still be recovered without degradation. It is worth noting that current post-production tools only 'burn' a colour grade as the last stage of the process (earlier on, the information on the grades is carried along the pipeline as metadata only, by the colour-correction software). This is formulated as in [2]:

det ∂L/∂(r,g,b) = det | ∂R/∂r  ∂R/∂g  ∂R/∂b |
                      | ∂G/∂r  ∂G/∂g  ∂G/∂b |  ≠ 0
                      | ∂B/∂r  ∂B/∂g  ∂B/∂b |
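Both conditions lend themselves to a quick numerical check. A minimal sketch (numpy assumed; the example transforms are stand-ins, not taken from the references) compares hue before and after a transform, verifies the constant gradient magnitude of sat, and evaluates the Jacobian determinant by central finite differences:

import numpy as np

def hue(c):
    r, g, b = c
    return np.arctan2(np.sqrt(3.0) * (g - b), 2.0 * r - g - b)   # atan2 handles the quadrants

def sat(c):
    r, g, b = c
    return 0.5 * np.sqrt(r*r + g*g + b*b - r*g - r*b - g*b)

def jacobian(transform, c, eps=1e-4):
    J = np.zeros((3, 3))
    for j in range(3):
        d = np.zeros(3)
        d[j] = eps
        J[:, j] = (transform(c + d) - transform(c - d)) / (2 * eps)
    return J

exposure = lambda c: 1.5 * c                          # uniform gain: preserves hue exactly
warm     = lambda c: np.array([1.2, 1.0, 0.85]) * c   # unequal per-channel gains: shifts hue

c = np.array([0.6, 0.35, 0.2])
print(hue(exposure(c)) - hue(c))          # ~ 0.0
print(hue(warm(c)) - hue(c))              # non-zero: the hue has shifted
print(np.linalg.det(jacobian(warm, c)))   # != 0: 'warm' satisfies the existence-of-inverse condition

eps = 1e-5
grad_sat = np.array([(sat(c + d) - sat(c - d)) / (2 * eps) for d in np.eye(3) * eps])
print(np.linalg.norm(grad_sat), np.sqrt(3.0 / 8.0))   # both ~ 0.612, as stated above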

4. ON-SET COLOUR GRADING             
    AND ITS COLOUR LANGUAGE (CDL) 

Since productions now shoot digitally and stay digital throughout the pipeline, one of the movie-chain blocks to have recently taken advantage of this is principal photography, where early colour correction/grading can be done effectively, on-set, just minutes after each clip is shot, cf. [11], Fig. 3.
Creative colour-correction (grading) information can be transported, clip by clip as they are originally shot, as a series of simple non-linear transformations controlled by 10 parameters (3 parameters for each of the 3 RGB channels, plus 1), each representing one degree of freedom of the creative colourist: 'slope', 'offset' and 'power' triples, plus a 'saturation' parameter. A set of such quantities, transported for a whole video asset, cut by cut, makes up the 2014 OSCARS®-winning American Society of Cinematographers' Colour Decision List (ASC CDL) and is a well-known example of simple mathematical equations at the creative service of motion-picture colourists [5], [11]. This can also be re-written by means of three functions S (slope), O (offset) and P (power), with s, o, p the respective controlling parameters (identity values 1, 0 and 1 respectively):

• Slope S(c;s) = s·c (the CDL analogue of colourists' lift, even though slope fixes the black point at code-value 0.0, whereas lift fixes the white point at code-value 1.0);
• Offset O(c;o) = c + o (the CDL analogue of colourists' gain);
• Power P(c;p) = max(0, c)ᵖ, the CDL analogue of a gamma correction.
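Composing the three operators per channel (as in the formula displayed further below) and adding the saturation stage, a minimal per-pixel sketch follows (numpy assumed; the Rec.709 luma weights of the saturation step follow the ASC CDL specification, while the numeric slope/offset/power/saturation values are made-up examples):

import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])   # luma weights used by the ASC CDL sat stage

def apply_cdl(rgb, slope, offset, power, saturation):
    """Apply ASC CDL slope/offset/power per channel, then the saturation stage."""
    rgb  = np.asarray(rgb, dtype=float)
    sop  = np.clip(rgb * slope + offset, 0.0, None) ** power    # per-channel SOP
    luma = np.dot(REC709_LUMA, sop)
    return luma + saturation * (sop - luma)

# identity parameters leave the colour untouched
print(apply_cdl([0.30, 0.61, 0.18], slope=(1, 1, 1), offset=(0, 0, 0),
                power=(1, 1, 1), saturation=1.0))
# a warm, slightly lifted, low-saturation grade (made-up values)
print(apply_cdl([0.30, 0.61, 0.18], slope=(1.10, 1.00, 0.92),
                offset=(0.02, 0.01, 0.00), power=(0.95, 1.00, 1.05),
                saturation=0.85))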

Figure 3 – Example of commercial 
colour grading software GUI with the 
main 3-way colour-corrector wheels.

L(c) = P(O(S(r; s_R); o_R); p_R) r̂ + P(O(S(g; s_G); o_G); p_G) ĝ + P(O(S(b; s_B); o_B); p_B) b̂
     = (s_R·r + o_R)^p_R r̂ + (s_G·g + o_G)^p_G ĝ + (s_B·b + o_B)^p_B b̂




This is a non-orthogonal decomposition into 1st- and 2nd-degree polynomials (slope + offset) plus a non-linear function (power), so the inner products may not "behave" well. It can be shown, however (and it is in fact well-known practice for every non-mathematician colourist working on still or moving pictures), that a sufficiently low number of such operators, governed by a few parameters (like the weighting coefficients of a linear combination in Linear Algebra), allows for quite good approximations, thus leading to orthogonal decompositions of colour operators:

L(c) = Σₖ θₖ lₖ(c; ·)

where all the 1-parameter vector fields lₖ are known a priori, whereas the coefficients θₖ and the parameters of each lₖ are the real descriptors of the "look".
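For fixed per-field parameters, the coefficients θₖ can be estimated from a handful of sampled colours by ordinary least squares; the numpy sketch below uses three made-up basis fields (overall gain, a push towards neutral grey, and a warm/cool axis) purely to illustrate the idea of numerical "look descriptors":

import numpy as np

# Hypothetical 1-parameter basis fields l_k(c), evaluated over a set of sample colours
basis = [
    lambda c: c,                                                  # overall gain
    lambda c: np.ones_like(c) * c.mean(axis=-1, keepdims=True),   # push towards neutral grey
    lambda c: c * np.array([1.0, 0.0, -1.0]),                     # warm/cool (red-vs-blue) axis
]

rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=(64, 3))      # sample colours spread over the RGB cube
target  = samples * 1.1 + 0.05 * (samples.mean(axis=1, keepdims=True) - samples)  # a mild grade

A = np.stack([l(samples).ravel() for l in basis], axis=1)    # design matrix (192 x 3)
theta, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
print("estimated look descriptors theta_k:", theta)          # ~ [1.05, 0.05, 0.0]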

5. THE ACADEMY COLOR ENCODING
    SYSTEM (ACES)

The Academy of Motion Picture Arts and Sciences (AMPAS)'s Science and Technology Council has been gathering a group of varied experts from top-level production and post-production facilities and software houses in the industry to put forward a solution unifying such colour-management issues: ACES, [1], [12].
The reason behind ACES is the need to address, in particular, the plethora of colorimetries set by manufacturers' digital equipment (both image-creating and image-reproducing), far more numerous in the digital era than film processes ever were in the past. A similar need had already surfaced in the Digital Cinema industry: that is why its DCI X'Y'Z' colorimetry derives from the device-independent CIE XYZ colour space ([3], [10]). Unfortunately, as neither colorimetric cameras nor colorimetric monitors/projectors exist as of yet, this colour-space choice has led back to one within the RGB model, which is more practical, also because ACES mostly pertains to TV and moving pictures.

Figure 4 - Sketch of the ACES 
paradigm: the original scene is 
either captured by a real camera 
or generated in CGI. Whatever the 
source, the corresponding Input 
Transform converts the code-
values into the SMPTE2065 colour 
space (except for the “ideal” RICD, 
which already produces SMPTE2065 
pictures). Using the Output Transform 
the pictures can then be transferred to 
any output device, like monitors (with 
any technologies), projectors, TVs, etc.

Every colour-correction operator in the involved pipelines (from camera controls, to colour-grading suites, to projectors' and TVs' balance controls) is, in fact, RGB-based.
Version 1.0 of ACES, [12], on whose project the author has been cooperating with the AMPAS experts since 2012, is a framework with a centralized colour-management paradigm, developed after many years of pre-testing among facilities and companies in the industry, in which the image is evaluated according to its colorimetric digital representation. Please refer to Fig. 4 for a schematic throughout this section.
First of all, ACES defines AP0 and AP1, two sets of RGB primaries for the four ACES colour spaces. AP0's CIE xy chromaticities are (0.73470, 0.26530) for Red, (0.0, 1.0) for Green and (0.0001, −0.0770) for Blue; AP1's chromaticities are (0.713, 0.293) for Red, (0.165, 0.830) for Green and (0.128, 0.044) for Blue. Both use the CIE D60 illuminant (0.32168, 0.33767) as white point, with the physical black point at the CIE XYZ triple (0, 0, 0).
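From these chromaticities the RGB↔XYZ conversion matrices follow from the standard normalised-primary-matrix construction; the numpy sketch below builds the AP0-to-XYZ matrix from the chromaticities listed above (the function is generic and works equally for AP1 or any other RGB primaries):

import numpy as np

def rgb_to_xyz_matrix(prim_xy, white_xy):
    """Normalised primary matrix from (R, G, B) xy chromaticities and the white point."""
    xy = np.asarray(prim_xy, dtype=float)
    P = np.vstack([xy[:, 0], xy[:, 1], 1.0 - xy.sum(axis=1)])   # columns: R, G, B as (x, y, z)
    xw, yw = white_xy
    W = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])          # white-point XYZ with Y = 1
    return P * np.linalg.solve(P, W)                            # scale each primary column

AP0 = [(0.73470, 0.26530), (0.0, 1.0), (0.0001, -0.0770)]
D60 = (0.32168, 0.33767)
M = rgb_to_xyz_matrix(AP0, D60)
print(np.round(M, 5))                    # AP0 RGB -> CIE XYZ
print(np.round(np.linalg.inv(M), 5))     # CIE XYZ -> AP0 RGB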
Within the ACES colour pipeline the image is considered as virtually captured by a Reference Input Capture Device (RICD), an idealized digital 'camera' recording in an RGB colour space called SMPTE2065 after the standard that defines it. Another important aspect is that SMPTE2065 is a scene-referred colour space, i.e. the code-values represent mean relative exposures with respect to the one captured from a perfect reflecting diffuser (apart from a 15% glare). In AP0, this means that a normally-exposed 18% grey card acquired by the RICD maps to the RGB triple (0.18, 0.18, 0.18).
Any real camera's imagery and colorimetry is brought into the pipeline by means of a colour gamut mapping called an ACES Input Transform, which basically converts the camera's colorimetry into SMPTE2065. Currently, Input Transforms are provided for most of the patented, cinema-grade cameras, like the ARRI Alexa, the cameras by RED™, the F5/F55 family by Sony, the cameras by Blackmagic Design and the Cinema EOS™ family by Canon; each maps the sensor's proprietary colorimetry (ARRI Log C, REDLog, S-Log/S-Gamut, BMD Log and Canon Log respectively), parametrized by shooting settings like equivalent sensitivity (ISO) or correlated colour temperature (CCT), into scene-referred SMPTE2065 code-values.
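Conceptually, an Input Transform is little more than the inverse of the camera's log transfer characteristic followed by a 3×3 gamut matrix towards AP0. The sketch below is purely a stand-in (numpy assumed): the log parameters and the matrix are made-up placeholders, not any manufacturer's published Input Transform data.

import numpy as np

# Made-up "camera log" parameters and camera-RGB -> AP0 matrix (placeholders, NOT real IDT data)
LOG_A, LOG_B, LOG_C = 5.0, 0.05, 0.60
CAM_TO_AP0 = np.array([[0.85, 0.10, 0.05],    # each row sums to 1, so neutrals stay neutral
                       [0.05, 0.90, 0.05],
                       [0.02, 0.08, 0.90]])

def toy_input_transform(cam_log_rgb):
    """Decode a toy log curve to scene-linear, then rotate into AP0 primaries."""
    code = np.asarray(cam_log_rgb, dtype=float)
    linear = 10.0 ** ((code - LOG_C) * LOG_A) - LOG_B     # inverse of the generic log curve
    return CAM_TO_AP0 @ linear

print(toy_input_transform([0.45, 0.41, 0.38]))   # a scene-referred SMPTE2065 (AP0) triple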





Figure 5 - Real-world ACES testing to 
compare ARRI Alexa XT (shown here), 
RED™ EPIC and Sony F55 cameras 
on the same technical set and ACES 
colour space (also courtesy of DIT E. 
Zarlenga).

Figure 6 – Main ACES v1.0 components in place: footage shot in the Alexa camera's native ARRIRAW frame-per-file format (in the Log C colour space) is technically processed by an Input Transform into the scene-referred SMPTE2065 colour space. This is where colour grading is applied (in the temporary ACEScc space). The CDL designed during principal-photography (on-set) pre-grading is conceptually applied above this grade, but below optional creative-technical transforms like the PFE. The file is then ready to be saved as an ACES-compliant OpenEXR sequence (SMPTE2065 colour space) or can be sent to a display device through an Output Transform.

The author has also been active in Italy promoting the use of ACES with several initiatives, [1], including a real-world, on-set test to compare the ACES framework across different, high-profile cameras, up to a full VFX and Digital Cinema mastering pipeline: Fig. 5 shows the result of the technical photography session.
At the other end of the pipeline, SMPTE2065 colorimetry is converted to the gamut and chromatic adaptation of the displaying device by means of an ACES Output Transform: there is one, for example, for Digital Cinema mastering (DCI P3) in a dark surround, two for the standard broadcast-TV gamut (Rec.709), and two for UHDTV (Rec.2020), with one each for bright- and dark-surround adaptation. From a Colour Appearance Model (CAM) perspective, the Output Transforms take care of the viewing environment as well: several Output Transforms may thus exist for the same device, but under different chromatic adaptations. All the Output Transforms share a common first mathematical block, called the Reference Rendering Transform (RRT). Please refer to Fig. 6 for a block diagram of the main ACES version 1.0 components.
All in all, the SMPTE2065 space uses AP0 primaries, has trivial transfer characteristics (i.e. it is photometrically linear, "gamma 1.0") and represents the baseline for the whole ACES pipeline, as well as its widest gamut, which also makes it suited for long-term archiving, cf. Fig. 6. Code-values are usually encoded as 16 bits/channel floating-point numbers ('half-floats' as per the IEEE 754-2008 standard), and archived in a specific frame-per-file variant of the OpenEXR file format, cf. [14]–[15].
It is within this space that images are mainly 
worked on, with exceptions when it is technically 
convenient or mandatory to use temporary, well-
defined colour-spaces for specific purposes:

• ACEScc has AP1 primaries, a "logarithmic" transfer characteristic and 32 bits/channel float encoding, optimized for film-style colour correction, [13] (a sketch of this encoding follows the list);
• ACEScg has AP1 primaries, is photometrically linear, with 16 or 32 bits/channel integer code-values, optimized for CG and painting applications that scarcely support images represented by floating-point code-values, [16];
• ACESproxy has AP1 primaries, the same logarithmic characteristic as ACEScc, and 10 or 12 bits/channel integer encoding, optimized for real-time transport of images over physical links (e.g. the SDI cable family) that only support integer code-values, while a logarithmic encoding is still needed for on-set colour-correction applications, [17].
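A minimal sketch of the piecewise-logarithmic ACEScc encoding follows (numpy assumed); the constants reflect those published in [13], which remains the authoritative source:

import numpy as np

def lin_to_acescc(lin):
    """Piecewise-log ACEScc encoding of scene-linear (AP1) values, after S-2014-003."""
    lin = np.asarray(lin, dtype=float)
    return np.where(
        lin <= 0.0,
        (np.log2(2.0 ** -16) + 9.72) / 17.52,                       # floor for non-positive input
        np.where(
            lin < 2.0 ** -15,
            (np.log2(2.0 ** -16 + lin * 0.5) + 9.72) / 17.52,       # blended toe near black
            (np.log2(np.maximum(lin, 2.0 ** -15)) + 9.72) / 17.52,  # pure log2 segment
        ),
    )

print(lin_to_acescc([0.0, 0.18, 1.0, 16.0]))   # 18% grey ~ 0.414, diffuse white ~ 0.555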

Similarly, output-device compatibility is provided by first mapping to an ideal output Reference Display Device (RDD) via the aforementioned RRT, and then to the actual device by means of a discrete mathematical formula called the Output Device Transform (ODT), which depends on the output colour space and, ultimately, on the output device's transfer characteristics (e.g. monitors, projectors, printers, D-Cinema devices, etc.).
ACES images are stored as ordered frame-per-file sequences, encoding each frame as an OpenEXR file [14], together with ACES-specific metadata optionally written in a "sidecar" XML file called the ACES clip-container.
Ideally, any sensitive colour operation (for both technical and creative intent) should take place in either the SMPTE2065 or the ACEScc colour space (which act like the PCS in the ICC paradigm), where any operator acts unambiguously. Creative-intent operations, in particular, are stored in the so-called Look Modification Transform(s) (LMT), which are applied before the Output Transform.
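Summarising the ordering of these blocks as plain function composition (every callable below is a stand-in for whatever concrete transform a given show uses):

# Schematic ACES viewing pipeline as function composition; all arguments are placeholder callables.
def view_on_display(camera_code_values, input_transform, grade, lmt, rrt, odt):
    aces = input_transform(camera_code_values)   # camera/CGI colorimetry -> SMPTE2065 (AP0)
    aces = grade(aces)                           # creative grade (typically worked in ACEScc)
    aces = lmt(aces)                             # optional Look Modification Transform(s)
    return odt(rrt(aces))                        # Output Transform = RRT followed by the ODT

# e.g. view_on_display(raw_frame, alexa_input_transform, show_grade, pfe_lmt, aces_rrt, rec709_odt)
# (all of the above names are hypothetical placeholders)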

6. CONCLUSIONS: 
    THE COLOUR 'LOOK' OF A FILM

Several professionals in the video, post-production and DI world, as well as colour scientists and vendors, have long tried to define what "look" technically means in this context. In the author's opinion, science, experience and common practice add up to the statement that, currently, a "look" is the ensemble of creative colour decisions made for a specific set of scenes (e.g. scenes shot to represent the same lighting and dramatic situation, if not possessing outright location/temporal unity) that pertain neither to the technical properties of the colours themselves nor to the devices/media used to reproduce them. In this sense a "look" is different from a "film look", as the latter also includes colour characteristics due to the combination of a film's emulsion, development and printing processes (which can of course be emulated).
A look is applied to two differently-exposed and differently-coloured scenes not to match them chromatically (that is what the same-name phase of a colour-correction session is for), but rather to give both the same visual impact in the director's and/or cinematographer's minds: the incisive 'colour fingerprint' unique to that specific product and cinematography. The 'look development' phase has therefore been starting earlier and earlier in production, up to taking place on-set, with the help of a proper pre-grading workflow (e.g. the one proposed by Technicolor DI supervising colourist Peter Doyle for the Harry Potter VII, Dark Shadows and Paddington feature films).
The lack of interoperable schemes, the incompatibility of file formats and colour operations, even the absence of a unified terminology, have prevented many colour manipulations from happening at all, or at least have made them much more difficult and prone to errors or loss of precision. Addressing this can no longer be delayed, since all film processes have turned completely digital, and this is where the author's contributions have been focused in recent years.

Figure 7 – Chromaticity comparison between the ACES (SMPTE2065) gamut and other well-known RGB colour spaces.




Unifying the post-production terminology and mathematical formalism (which were traditionally tied to individual post-production laboratories' own film-processing technologies and manufacturers' secret sauces) means creating a common baseline to start from and to communicate with mainstream Colour Science. ACES is another step in the attempt to create common processes and workflows that ease interoperability and help future-proof archival footage. Of course all of this is a process: it is not meant to be set in stone, but rather to progress continually as new technologies, methodologies and, above all, creative eyes and minds turn up, bringing along their expression techniques and visions, for this process to drive them all along together.

BIBLIOGRAPHY

[1] W. Arrighetti, “The Academy Color Encoding System 
(ACES) in a video production and post-production colour 
pipeline”, Colour and Colorimetry, XI(B), Maggioli, 2015.

[2] W. Arrighetti, “New trends in Digital Cinema: from on-
set colour grading to ACES”, at CHROMA: workshop on 
colour image between motion picture and media, chroma.
di.unimi.it, Sept. 2013

[3] W. Arrighetti, “Colour Management in motion picture 
and television industries”, Colour and Colorimetry, VII(B), 
63–70, Maggioli, 2011.

[4] W. Arrighetti,  Mathematical models and methods 
for Electromagnetism in Fractal Geometries, Ph.D 
dissertation, Sapienza University of Rome, Rome, 2007.

[5] W. Arrighetti, “Colour correction calculus (CCC): 
engineering the maths behind colour grading”, Colour and 
Colorimetry, IX(B), 13–19, Maggioli, 2013.

[6] W. Arrighetti, “Moving Picture Colour Science: the 
maths behind Colour LUTs, ACES and film ‘Looks’”, Color 
and Colorimetry, VIII(B), 27–34, Maggioli, 2012.

[7] S. Westland, C. Ripamonti, Computational Colour 
Science using MATLAB. Wiley, 2012.

[8] Colorimetry, Second Edition, Commission Internationale 
de L’Éclairage, Publication 15.2, 1986.

[9] M. Petrou, C. Petrou, Image Processing: the 
fundamentals, Wiley, 2012.

[10] C. Poynton, Digital Video and HD: algorithms and 
interfaces, Morgan-Kaufman, 2012.

[11] A.B. Benitez, L. Blondé, B. Lee, J. Stauder, H. Gu., “ASC 
CDL: a step towards Look Management”, Proceedings of 
IBC 2007.

[12] Academy Color Encoding Specification (ACES), 
Society of Motion Picture and Television Engineers 
(SMPTE), Standard 2065-1, 2012.

[13] ACEScc, a Logarithmic encoding of ACES data for use 
with Color Grading systems, Specification S-2014-003, 
Academy of Motion Picture Arts and Sciences (AMPAS), 
v1.0, Dec. 2014.

[14] OpenEXR File Layout, OpenEXR working group, www.openexr.com, Apr. 2007.

[15] ACES Image Container file layout, SMPTE, Standard 
2065-4, 2013.

[16] ACEScg, a working space for CGI render and 
compositing, S-2014-004, AMPAS, v1.0, Dec. 2014.

[17] ACESproxy, an Integer Log encoding of ACES image 
data, S-2013-001, AMPAS, v2.0, Dec. 2014.

[18] F. Pierotti, “The colour turn: l’impatto digitale sul 
colore cinematografico”, Bianco e nero, LXXV(580), 26–34, 
Carocci, Rome, Sept.–Dec. 2014.