I implemented this by defining a color range that varies with the angle between the eye vector and each fragment's normal.
Options:
- irrStartCol is the starting color, applied at the irrThreshold dot product (the effect only appears where the dot product >= irrThreshold).
- irrEndCol is the end color, applied at a dot product of 1.0 (direct view of the fragment). The colors are lerp'ed in HSV space, so you get a nice variation along hue. Setting a lower saturation for the starting value gives a nicer fade-in of the color.
- I would have liked to implement low/high thresholding rather than just low/1.0, since iridescence in nature seems to sometimes (usually?) rely on more oblique viewing angles, and direct view angles may not necessarily iridesce (is that a word?).
- The irrRampExp is the exponent for ramping in the effect starting at irrThreshold, so you can soften or harden the starting colors.
- The irrWhiteOnly option applies iridescence only to very white fragments, so if the 'useTexture' option is enabled you can get the regular textured clothes with iridescent gloves and eye whites (not perfect).
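Putting those options together, a minimal GLSL sketch of the ramp might look like this (an illustration, not the exact implementation; it assumes the start/end colors are supplied in HSV):

```glsl
// HSV -> RGB conversion (the well-known Sam Hocevar formulation).
vec3 hsv2rgb(vec3 c) {
    vec4 K = vec4(1.0, 2.0 / 3.0, 1.0 / 3.0, 3.0);
    vec3 p = abs(fract(c.xxx + K.xyz) * 6.0 - K.www);
    return c.z * mix(K.xxx, clamp(p - K.xxx, 0.0, 1.0), c.y);
}

uniform vec3 u_irrStartCol;   // start color, assumed given in HSV
uniform vec3 u_irrEndCol;     // end color, assumed given in HSV
uniform float u_irrThreshold; // dot product at which the effect begins
uniform float u_irrRampExp;   // exponent that softens/hardens the ramp-in

vec3 iridesce(vec3 normal, vec3 eyeDir, vec3 baseCol) {
    float d = dot(normalize(normal), normalize(eyeDir));
    if (d < u_irrThreshold) {
        return baseCol; // below the threshold: no effect
    }
    // Remap [irrThreshold, 1.0] onto [0, 1] and shape the ramp-in.
    float t = pow((d - u_irrThreshold) / (1.0 - u_irrThreshold), u_irrRampExp);
    // Lerp in HSV so the hue sweeps smoothly, then convert back to RGB.
    return hsv2rgb(mix(u_irrStartCol, u_irrEndCol, t));
}
```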
I figured you wanted just the implementation of taking a lit sphere image and using it to shade the object, rather than the full implementation of the referenced paper, which worked on creating the sphere mappings from artwork.
Different simple sphere files can be loaded; I just made these in Photoshop. I didn't have time to get the sphere file displayed in the render.
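The core of the lookup is a matcap-style sample: remap the view-space normal's xy into [0, 1] and index the sphere image with it. A minimal sketch, with hypothetical uniform/varying names:

```glsl
uniform sampler2D u_litSphere; // the lit sphere (matcap) image
varying vec3 f_viewNormal;     // normal transformed into view space

void main() {
    // A view-space normal's xy lands in [-1, 1]; remap to [0, 1] so it
    // indexes directly into the sphere image.
    vec2 uv = normalize(f_viewNormal).xy * 0.5 + 0.5;
    gl_FragColor = texture2D(u_litSphere, uv);
}
```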
Simple fisheye effect. You can change the amount of the effect with the 'fisheye level' option. I'd like to know how to smooth the result, but maybe that's not possible without doing a second post-render pass for smoothing?
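For reference, the bulge can be done entirely in the post pass by warping the uv before sampling. A sketch, with `u_level` standing in for the 'fisheye level' option:

```glsl
uniform sampler2D tDiffuse; // previously rendered frame
uniform float u_level;      // 0 = no effect, 1 = full bulge
varying vec2 f_uv;

void main() {
    vec2 p = f_uv * 2.0 - 1.0; // center the coordinates at (0, 0)
    float r = length(p);
    // Pull samples toward the center more strongly near the middle of the
    // screen, which magnifies the center while the r = 1 ring stays put.
    vec2 warped = p * mix(1.0, r, u_level);
    gl_FragColor = texture2D(tDiffuse, warped * 0.5 + 0.5);
}
```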
I did this as a post shader to easily get uv's for the whole screen, but this means the hatching is tied to the screen position rather than the object. Maybe I could use the projected position in a regular fragment shader, since using the texture uv's gives odd behavior that depends on the object's texture mapping. But I think it looks OK for quick work.
There are two hatching patterns, and each has these options:
- spatial frequency in x & y, to create diagonal lines. For the second hatch, the phase of the y component's contribution is reversed to get the diagonal in the other direction.
- 'scale' is the overall scale of the hatching pattern relative to the other
- 'noise' is a simple noise factor that offsets the phase
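A sketch of how such a pattern pair can be combined in the post shader (the uniform names and the luminance gating are my own illustration, not the exact implementation):

```glsl
uniform sampler2D tDiffuse;
uniform vec2 u_freq;   // spatial frequency in x and y
uniform float u_scale; // overall scale relative to the other pattern
uniform float u_noise; // strength of the phase-offsetting noise
varying vec2 f_uv;

// Cheap hash standing in for the noise factor.
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

void main() {
    vec4 col = texture2D(tDiffuse, f_uv);
    float phase = u_noise * hash(f_uv);
    // x + y gives one diagonal; subtracting y (reversed phase) gives the other.
    float h1 = sin((f_uv.x * u_freq.x + f_uv.y * u_freq.y) * u_scale + phase);
    float h2 = sin((f_uv.x * u_freq.x - f_uv.y * u_freq.y) * u_scale + phase);
    // Hatch darker regions more heavily: one direction in midtones, both in shadows.
    float lum = dot(col.rgb, vec3(0.299, 0.587, 0.114));
    float shade = 1.0;
    if (lum < 0.66) shade *= step(0.0, h1);
    if (lum < 0.33) shade *= step(0.0, h2);
    gl_FragColor = vec4(col.rgb * shade, col.a);
}
```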
Implement at least 75 points' worth of shaders from the following list. We reserve the right to grant only partial credit for shaders that do not meet our standards, as well as extra credit for shaders that we find to be particularly impressive.
Some of these shading effects were covered in lecture -- some were not. If you wish to implement the more complex effects, you will have to perform some extra research. Of course, we encourage such academic curiosity which is why we’ve included these advanced shaders in the first place!
Document each shader you implement in your README with at least a sentence or two of explanation. Well-commented code will earn you many brownie (and probably sanity) points.
If you use Shadertoy or any materials as reference, please properly credit your sources in the README and at the top of the shader file. Failing to do so constitutes plagiarism and will significantly reduce your points.
Examples: https://cis700-procedural-graphics.github.io/Project5-Shaders/
- Tone mapping (see the GLSL sketch after this list):
- Linear (5 points)
- Reinhard (5 points)
- Filmic (5 points)
- Gaussian blur (no double counting with Bloom)
- Iridescence
- Pointillism
- Vignette
- Fish-eye bulge
- Bloom
- Noise Warp
- Hatching
- Edge detection with Sobel filtering
- Lit Sphere (paper)
- Uncharted 2 customizable filmic curve, following John Hable's presentation.
- Without Linear, Reinhard, filmic (10 points)
- With all of linear, Reinhard, filmic (10 points)
- Customizable via GUI: (5 points total)
- Controlling Exposure
- Side by side comparison between linear, Reinhard, filmic, and Uncharted 2.
- K-means color compression (unless you are extremely clever, the k-means clusterer has to be CPU side)
- Dithering
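For the three basic tone-mapping entries above, the standard operators are small enough to sketch in GLSL (exposure scaling omitted; the filmic curve is the well-known Hejl/Burgess-Dawson approximation, which bakes in gamma correction):

```glsl
// Linear: scale and clamp.
vec3 toneLinear(vec3 c) {
    return clamp(c, 0.0, 1.0);
}

// Reinhard: compresses highlights smoothly toward 1.
vec3 toneReinhard(vec3 c) {
    return c / (1.0 + c);
}

// Filmic (Hejl/Burgess-Dawson approximation, gamma baked in).
vec3 toneFilmic(vec3 c) {
    vec3 x = max(vec3(0.0), c - 0.004);
    return (x * (6.2 * x + 0.5)) / (x * (6.2 * x + 1.7) + 0.06);
}
```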
Implement a dropdown GUI to select different shader effects from your list.
Propose your own shading effects!
Weave all your shading effects into one aesthetically-coherent scene, perhaps by incorporating some of your previous assignments!
`main.js` is responsible for setting up the scene with the Mario mesh, initializing the GUI and camera, etc.
To add a shader, you'll want to add a file to the `src/shaders` or `src/post` folder. As examples, we've provided two shaders, `lambert.js` and `grayscale.js`. Here, I will give a brief overview of how these work and how everything hooks together.
`shaders/lambert.js`

IMPORTANT: I make my lambert shader available by exporting it in `shaders/index.js`:

```javascript
export {default as Lambert} from './Lambert'
```
Each shader should export a function that takes in the `renderer`, `scene`, and `camera`. That function should return a `Shader` object.
`Shader.initGUI` is a function that will be called to initialize the GUI for that shader. In `lambert.js`, you can see that it's here that I set up all the parameters that will affect my shader.
`Shader.material` should be a `THREE.ShaderMaterial`. This should be pretty similar to what you've seen in previous projects. `Shader.material.vertexShader` and `Shader.material.fragmentShader` are the vertex and fragment shaders used.
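Putting the contract together, a new shader module might look roughly like this (a sketch only; the `u_amount` uniform, the glsl paths, and the GUI parameter are placeholders, not files in this repo):

```javascript
var THREE = require('three');

export default function(renderer, scene, camera) {
    var options = { amount: 1.0 }; // hypothetical parameter

    var Shader = {
        material: new THREE.ShaderMaterial({
            uniforms: {
                texture: { type: 't', value: null },
                u_amount: { type: 'f', value: options.amount }
            },
            vertexShader: require('../glsl/example-vert.glsl'),  // placeholder
            fragmentShader: require('../glsl/example-frag.glsl') // placeholder
        }),
        initGUI: function(gui) {
            // Hook shader parameters up to dat.GUI controls here.
            gui.add(options, 'amount', 0.0, 1.0).onChange(function(val) {
                Shader.material.uniforms.u_amount.value = val;
            });
        }
    };

    return Shader;
}
```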
At the bottom, I have the following snippet of code. All it does is bind the Mario texture once it's loaded.
```javascript
textureLoaded.then(function(texture) {
    Shader.material.uniforms.texture.value = texture;
});
```
So when you change the Shader parameter in the GUI, `Shader.initGUI(gui)` will be called to initialize the GUI, and then the Mario mesh will have `Shader.material` applied to it.
`post/grayscale.js`
GUI parameters here are initialized the same way they are for the other shaders.
Post-process shaders should use the THREE.js `EffectComposer`. To set up the grayscale filter, I first create a new composer: `var composer = new EffectComposer(renderer);`. Then I add a render pass as the first pass: `composer.addPass(new EffectComposer.RenderPass(scene, camera));`. This sets up the composer to render the scene as normal into a buffer. I add my filter to operate on that buffer: `composer.addPass(GrayscaleShader);`, and mark it as the final pass that will write to the screen: `GrayscaleShader.renderToScreen = true;`.
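Gathered in one place, that setup reads as follows; each frame you then render through the composer rather than the renderer:

```javascript
var composer = new EffectComposer(renderer);
composer.addPass(new EffectComposer.RenderPass(scene, camera)); // draw the scene into a buffer
composer.addPass(GrayscaleShader);                              // filter that buffer
GrayscaleShader.renderToScreen = true;                          // final pass writes to the screen

// In the render loop, drive the composer instead of calling renderer.render:
composer.render();
```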
GrayscaleShader is an `EffectComposer.ShaderPass`, which basically takes the same arguments as `THREE.ShaderMaterial`. Note that one uniform you will have to include is `tDiffuse`. This is the texture sampler to which the EffectComposer will automatically bind the previously rendered pass. If you look at `glsl/grayscale-frag.glsl`, this is the texture we read from to get the previous pixel color: `vec4 col = texture2D(tDiffuse, f_uv);`.
IMPORTANT: You initially define your shader passes like so:
```javascript
var GrayscaleShader = new EffectComposer.ShaderPass({
    uniforms: {
        tDiffuse: {
            type: 't',
            value: null
        },
        u_amount: {
            type: 'f',
            value: options.amount
        }
    },
    vertexShader: require('../glsl/pass-vert.glsl'),
    fragmentShader: require('../glsl/grayscale-frag.glsl')
});
```
BUT, if you want to modify the uniforms, you need to do so like so: `GrayscaleShader.material.uniforms.u_amount.value = val;`. Note the extra `.material` property.
- Create a `gh-pages` branch on GitHub
- Do `npm run build`
- Add and commit all your changes
- Do `npm run deploy`