Friday, 14 December 2012

Week 12: 14/12/2012

Lecture:  

In today's lecture our lecturer offered to go over anything we felt we still needed help with.  However, no one took him up on it, so he told us to take half an hour to relax and be ready for the test.

Lab:  

In today's lab, we sat the final test.  It was done on the UWS Moodle virtual learning environment.  I got 12/20.


Week 11: 07/12/2012

Lecture:  

In today's lecture, we prepared for next week's final test by answering questions fired at us by our lecturer.

Lab:  

Today I created the video my lecturer requested.  This was the class task for the day.

Here is the link to the video:

https://www.youtube.com/watch?v=e1fkhvNE-Ak


Week 10: 30/11/2012

Lecture:  

Note:  Unlike earlier entries, the text and slide images for this lecture have not been merged into single pictures.  They have been kept separate, so I can type more for this one.  

The human eye is a very sensitive organ and can play tricks on its host.  Many famous pieces of art, when looked upon, appear to be moving even though they are still.  For example:  (See figure 1)

Figure 1 - Still Image - Does It Look As Though It Is Moving To You?  
It was believed that when the human eye sees an image, it holds that image (as if imprinted) for a 25th of a second, and that if another image is shown within that time, the two appear as a moving image.  This is called the persistence of vision, an old idea now often referred to as the myth of the persistence of vision.
"A more plausible theory to explain motion perception (at least on a descriptive level) are two distinct perceptual illusions: phi phenomenon and beta movement."  -  From PowerPoint lecture  

As said, if one image is produced after another in less than a 25th of a second, this gives the illusion of movement.  (See figure 2)  


Figure 2 - Frames That Are 0.04 Seconds Apart  
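As a quick illustration of that timing (a sketch of my own, not from the lecture), the interval between frames is just 1 divided by the frame rate:

```python
# Frame interval for common frame rates: interval = 1 / fps.
# At 25 fps each frame lasts 0.04 s, matching the "25th of a
# second" figure from the lecture and the caption above.
for fps in (24, 25, 30, 60):
    interval = 1.0 / fps
    print(f"{fps} fps -> {interval:.4f} s between frames")
```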
If video files were saved in an uncompressed format, the storage required would be enormous:  

"Uncompressed HD video files could be large. Say 3 bytes per pixel, 1920x1080 by 60 frames per second = 373.2 Mbytes per second, i.e. approximately 1 Gbyte every 3 seconds."  -  From PowerPoint lecture  

Even by today's standards, that is far too much data.  This is why compression algorithms and standards are used to reduce video files greatly in size.  
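As a sanity check on the slide's figures, here is a small Python sketch of my own reproducing the calculation:

```python
# Uncompressed video data rate: bytes_per_pixel * width * height * fps.
bytes_per_pixel = 3          # 24-bit RGB
width, height = 1920, 1080   # Full HD
fps = 60

bytes_per_second = bytes_per_pixel * width * height * fps
print(f"{bytes_per_second / 1_000_000:.1f} MB/s")  # 373.2 MB/s, as on the slide
print(f"1 GB every {1_000_000_000 / bytes_per_second:.1f} s")  # ~2.7 s
```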

The term bit rate refers to the number of bits per second used to represent a video file (or at least any portion of a file that is video).  Bit rates typically range from 300Kb per second to 8,000Kb per second.  The higher the bit rate, the better the quality of the video.  
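To get a feel for what those bit rates mean, here is a rough sketch of my own estimating stream size from bit rate and duration:

```python
# File size from bit rate: size_in_bits = bit_rate * duration.
def stream_size_mb(bit_rate_kbps: float, duration_s: float) -> float:
    """Size in megabytes of a stream at the given bit rate (Kb/s)."""
    bits = bit_rate_kbps * 1000 * duration_s
    return bits / 8 / 1_000_000  # bits -> bytes -> megabytes

# A 10-minute clip at the low and high ends of the quoted range:
print(stream_size_mb(300, 600))    # ~22.5 MB
print(stream_size_mb(8000, 600))   # ~600 MB
```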
"Interlaced video is a way to make the best use of the limited bandwidth for video transmission, especially in the old days of analogue broadcasts. The receiver (your TV) "tricks" your eyes by drawing first the odd number lines on the screen 25 times per second. Then the even lines of the next frame and so on. Progressive video does not interlace and appears sharper."  -  From PowerPoint lecture

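As a rough illustration of the odd/even-line idea (my own sketch, not from the lecture), a frame held as a NumPy array can be split into its two interlaced fields by taking alternate rows:

```python
import numpy as np

# A dummy 8x4 greyscale "frame"; rows are numbered 0..7.
frame = np.arange(32).reshape(8, 4)

top_field = frame[0::2]     # even-numbered lines (drawn first)
bottom_field = frame[1::2]  # odd-numbered lines (drawn next)

# Each field has half the vertical resolution of the full frame,
# which is how interlacing halves the bandwidth needed per pass.
print(top_field.shape, bottom_field.shape)  # (4, 4) (4, 4)
```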
Resolution refers to the number of pixels that can be represented on the display device.  The higher the resolution, the more pixels that can be represented.  

There are many different video formats.  A website that shows a list of them and a description of them is:  http://www.libtiff.org/video-formats.html

The more you compress a file, the more data the file loses.  Due to compression artifacts, images become more distorted.
Algorithms designed to compress video predictively can still have issues with fast-paced, unpredictable, detailed motion (like sport).
A possible solution to this is automatic video quality assessment.

As said before, image degradation can occur due to too much compression.  But it can also happen because of other factors.  (See figures 3 and 4)

Figure 3 - Distorted Image Due to Lossy Compression - Compressed Image  

Figure 4 - Distorted Image Due to Camera Lens Blurring - Blurred Image  
Is it possible for a computer to decide if an image is of good quality?  (See figure 5)

Figure 5 - Different Qualities of the Same Image, With Their Quality Values?  
The thing that holds the key is statistical algorithmic video processing.


Lab:  

In the lab, my lecturer asked the class to look at the tutorial videos on the Adobe website for their product Adobe Premiere Pro CS4.  This was in preparation for the video we are to make using clips and audio given to us.

Week 09: 23/11/2012

Lecture:  

Why Use Digital Image Processing?  


  • No use of darkroom  
  • Reduced time to produce a photo  
  • No noxious chemicals  
  • Perfect for experimenting time and again to achieve a certain desired effect  
  • The variety of options available for things that can be done to an image when digitally processed are far greater than that of darkroom processing  
    • Image manipulation  
    • Enhancement  
    • Transformation operations  
(See figures 1, 2, 3 and 4)  

Figure 1 - Digital Imaging Systems  

Figure 2 - Digital Camera Image Capture  

Figure 3 - Sensor Spatial Resolution  

Figure 4 - Digital Camera Colour  
The light collected by the lens is passed through an optical low-pass filter before it is focused onto the sensor array.  This optical low pass filter serves to:  

  • Exclude any picture data that is beyond the sensor's resolution  
  • Compensate for false colouration and RGB Moiré caused by drastic changes of colour contrast  
    • As in pictures of thin stripes and fine point sources  
  • Reduce infrared and other non-visible light which disturbs the sensor's imaging process  
What Is Moiré and How Do You Prevent or Remove It?  (See figure 5)  

Figure 5 - Before and After - Removal of Moiré  
"Moiré is a repetitive pattern of wavy lines or circles that occasionally appear on objects in digital captures. It occurs when fibers or fine parallel details in an object match the pattern of the imaging chip in the camera. Some camera companies incorporate anti aliasing filters in the cameras to slightly blur the tiny detail areas of captures, but other manufacturers prefer not to use them because they don’t want to sacrifice image sharpness. With or without the filtering, every digital camera is capable of creating moiré."  -  From PowerPoint lecture  

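The anti-aliasing filter mentioned in the quote can be imitated in software: slightly blurring an image before shrinking it suppresses the fine detail that causes moiré.  Here is a sketch of my own using Pillow; "photo.jpg" is a placeholder file name:

```python
from PIL import Image, ImageFilter

img = Image.open("photo.jpg")  # placeholder file name
small_size = (img.width // 4, img.height // 4)

# Naive downsample: fine repetitive patterns can alias into moiré.
aliased = img.resize(small_size, Image.NEAREST)

# Low-pass (blur) first, like a camera's anti-aliasing filter,
# then downsample: tiny detail is softened but moiré is reduced.
smoothed = img.filter(ImageFilter.GaussianBlur(radius=2)).resize(
    small_size, Image.NEAREST)

aliased.save("aliased.jpg")
smoothed.save("smoothed.jpg")
```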
(See figures 6 and 7)  
Figure 6 - Digital Image Fundamentals  

Figure 7 - The Pixel  
 Bit-Mapped Graphics  

A bit-mapped colour image, when represented in digital memory, is an ordered array of groups of bits.

Each of these groups codes the colour of a single pixel on the screen.  For example, with 8 bits for each of the red, green and blue levels, 24 bits are needed per pixel.

"The array of pixels could be 640 x 480 (i.e. VGA spatial resolution), with the colour of each pixel represented by a group of 24 bits requiring,  

640 x 480 x 24 = 7372800 bits  

i.e. approximately 7.4Mb per image.  1 Mb = 10⁶ bits"  -  From PowerPoint lecture  
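The slide's arithmetic can be reproduced in a few lines of Python (my own sketch):

```python
# Memory needed for an uncompressed bit-mapped image:
# width * height * bits_per_pixel.
width, height = 640, 480   # VGA spatial resolution
bits_per_pixel = 24        # 8 bits each for red, green and blue

total_bits = width * height * bits_per_pixel
print(total_bits)                          # 7372800 bits, as on the slide
print(f"{total_bits / 1e6:.1f} Mb")        # 7.4 Mb (1 Mb = 10^6 bits)
print(f"{total_bits / 8 / 1024:.0f} KiB")  # 900 KiB when expressed in bytes
```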

(See figures 8, 9, 10, 11 and 12)  
Figure 8 - Dynamic Range  


Figure 9 - Example of an Image Using 1-Bit Colour Depth  

Figure 10 - Example of 8-Bit Depth - Grey Scale  
Figure 11 - Example of 8-Bit Colour Depth  

Figure 12 - Example of 24-Bit Colour Depth  

Colour Palette  

A computer system can pre-determine and establish what colour palette is to be used; the system palette colours (for example, 256 of them) are then used for all images.

The fidelity of an 8-bit (256 colour) image is often enhanced by selecting the 256 colours closest to those actually present in the image (an adaptive palette).

When using a system that can only display 256 colours, displaying multiple images simultaneously can become a problem when using the adaptive palette.

Whether or not the palette is appropriate for certain images, the system MUST choose one palette and apply it to all images shown.  (See figures 13, 14 and 15)

Figure 13 - Example of a Non-Optimised Palette of 256 Colours  

Figure 14 - Example of Optimised Palette of 256 Colours  

Figure 15 - Example of 256 Colour Palettes  
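To see the difference between a fixed system palette and an adaptive palette in practice, here is a sketch of my own using Pillow ("photo.jpg" is a placeholder file name):

```python
from PIL import Image

img = Image.open("photo.jpg").convert("RGB")  # placeholder file name

# Adaptive palette: pick the 256 colours closest to those in this image.
adaptive = img.convert("P", palette=Image.ADAPTIVE, colors=256)

# Fixed system palette (web-safe): the same palette for every image,
# appropriate or not - like a 256-colour display showing many images.
fixed = img.convert("P", palette=Image.WEB)

adaptive.save("adaptive_256.png")
fixed.save("fixed_palette.png")
```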

What is Digital Image Processing?  

"Four common categorisations of DIP operations are analysis, manipulation, enhancement and transformation.  

Analysis operations provide information on photometric features of an image e.g. colour count, histogram.  

Manipulation operations change the content of an image e.g. flood fill, crop.  Input image yields output image.  

Enhancement operations attempt to improve the quality of an image in some sense e.g. heighten contrast, edge enhancement.  Input image yields output image.  

Transformation operations alter the image geometry e.g. rotate, skew.  Input image yields output image."  -  From PowerPoint lecture
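As a concrete example of an analysis operation, here is a small sketch of my own (using Pillow; "photo.jpg" is again a placeholder) that computes a greyscale histogram and a crude contrast check from it:

```python
from PIL import Image

grey = Image.open("photo.jpg").convert("L")  # placeholder file name

# Analysis: the histogram counts how many pixels have each grey level.
hist = grey.histogram()  # list of 256 counts for levels 0..255

# A good-contrast image spreads its counts across the full 0..255 range;
# a low-contrast image bunches them into a narrow band of levels.
used = [level for level, count in enumerate(hist) if count > 0]
print(f"levels used: {min(used)}..{max(used)}")
```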

(See all figures below) - Refresher note for blog readers: I would type out a lot more in my own words, but when a PowerPoint lecture slide is a picture in its own right (not a slide with words and a separate picture), I just put the whole slide in the blog.  

Figure 16 - A Typical Image Processing System  
CCD:  Charged Coupled Device  
ADC:  Analogue-to-Digital Converter
Figure 17 - Analysis - The Histogram  

Figure 18 - Histogram - A Good Contrast Image  

Figure 19 - Histogram - Low Contrast Image  

Figure 20 - Histogram - High Contrast Image  

Figure 21 - Transformation - Rotate  

Figure 22 - Transformation - Free Rotate  

Figure 23 - Manipulation - Block Fill  

Figure 24 - Enhancement - Ex 1 Depth  

Figure 25 - Enhancement - Ex 1 Depth - Large Depth of Field Original  

Figure 26 - Enhancement - Ex 1 Depth - Small Depth of Field Original  

Figure 27 - Enhancement - Ex 1 Depth - Simulated Small Depth of Field  

Figure 28 - Second Example of Enhancement  

Figure 29 - Enhancement - Ex 2 - Original  

Figure 30 - Enhancement - Ex 2 - Processed  

Monday, 10 December 2012

Week 8: 16/11/2012

Lecture:  

Light is a form of energy that is detected by the human eye.  Depending on the conditions, light appears to behave in some cases like a stream of particles (photons) and in others like a wave.  (See figures 1 & 2)

Figure 1 - How Photons Move - http://www.cs.utexas.edu/~mechin/photon_mapping/img/photon_map.jpg  
Figure 2 - Light Behaving Like A Wave and A Particle at the Same Time -  http://static.neatorama.com/images/2012-11/wave-particle-quantum-physics.jpg  
"Damn you, quantum physics! Just as we got used to the mind-boggling fact that light can act as either a wave OR as a particle, a new quantum physics experiment has shown that it can act like a wave AND a particle at the same time:
Now, for the first time, researchers have devised a new type of measurement apparatus that can detect both particle and wave-like behavior at the same time. The device relies on a strange quantum effect called quantum nonlocality, a counter-intuitive notion that boils down to the idea that the same particle can exist in two locations at once.
"The measurement apparatus detected strong nonlocality, which certified that the photon behaved simultaneously as a wave and a particle in our experiment," physicist Alberto Peruzzo of England's University of Bristol said in a statement. "This represents a strong refutation of models in which the photon is either a wave or a particle.""  -  
http://www.neatorama.com/2012/11/06/Quantum-Physics-Experiment-Shows-Light-Behaving-as-a-Wave-and-a-Particle-Simultaneously/ - Quantum Physics Experiment Shows Light Behaving as a Wave and a Particle Simultaneously

Unlike sound, light does not need a medium in order to propagate.  Sunlight is proof of this: it travels to the Earth through space, yet space contains no medium for a wave to travel through.  
Light waves are transverse in nature.  (See figure 3)  

Figure 3 - Transverse Wave - http://www.passmyexams.co.uk/GCSE/physics/images/transvers_waves_001.jpg  


Transverse waves of light are electromagnetic waves.  This leads to the conclusion that light is a particular form of electromagnetic radiation, like radio waves, and thus our eyes are, in effect, a form of electromagnetic receiver.  
The range of light visible to humans is a small part of the electromagnetic spectrum (see figure 4).  The frequencies a human can see span 400THz to 750THz (THz stands for terahertz; one terahertz is 10¹² Hz).  
These frequencies are perceived by the human eye as different colours, with red at the lowest frequencies and purple at the highest.  White is produced when these frequencies (colours) are mixed together.  A perfect example of this is a rainbow.  
A rainbow appears after rain has fallen and the sun comes out because, when white light from the sun passes through the falling raindrops, the frequencies that mix to produce the white light are split apart.  Once split into their separate frequencies, a rainbow is produced with red (the lowest frequency) at the top and purple (the highest frequency) at the bottom.  (See figure 5)  

Figure 4 - The Electromagnetic Spectrum  
Figure 5 - Separate Frequencies of Light and the Result of Mixed Frequencies - http://msprinsendam.files.wordpress.com/2010/11/picture11.png  


A rainbow goes in the order red, orange, yellow, green, blue, indigo, violet.  When the white light produced by the sun splits, it mainly takes the form of the primary colours, but secondary colours are produced by overlapping frequencies.  For example, where the red and yellow frequencies overlap they make orange, so the first part of a rainbow runs red, orange, yellow.  That said, each secondary colour still has its own frequency component: light with a frequency of 500THz looks orange.  (See figure 6)  

Figure 6 - Rare 'floating rainbow' - http://www.dailymail.co.uk/sciencetech/article-2155361/Rare-flying-rainbow-brightens-skies-southern-China.html  


In a vacuum (such as space) light travels at a constant speed of 300,000 km/s.  It travels at a slightly lower speed in air.  In glass, however, the speed is reduced by about 1/3.  This change in velocity at the air/glass interface is why glass makes a useful material for lenses.  
The speed of light in air is about 1 million times greater than the speed of sound in air: roughly 300,000,000 m/s for light versus 340 m/s for sound.  
As known from previous lectures (as well as Standard Grade and Higher physics), the frequency of a wave is the number of complete cycles every second.  The medium through which light travels does not affect the frequency of the light wave.  
Also known from previous lectures (as well as Standard Grade and Higher physics) is that the speed (velocity) of a wave is its wavelength multiplied by its frequency.  
With regard to the relationship between frequency, wavelength and velocity, the formula velocity = frequency x wavelength shows that if velocity changes, either wavelength or frequency must change too.  Since frequency does not change when light passes into a different medium, it is the wavelength that changes.  (See figure 7)  

Figure 7 - Wavelength Changes With Velocity - How the Frequency Does Not Change Even If the Medium It Travels Through Does  
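A worked example (my own, not from the lecture): applying velocity = frequency x wavelength, keeping the frequency fixed while the velocity changes between media.

```python
# Wave equation: velocity = frequency * wavelength, so wavelength = v / f.
c_vacuum = 3.0e8              # speed of light in a vacuum, m/s
c_glass = c_vacuum * (2 / 3)  # roughly a third slower in glass (per the lecture)

f = 500e12  # 500 THz: a frequency the lecture says looks orange

print(f"vacuum: {c_vacuum / f * 1e9:.0f} nm")  # 600 nm
print(f"glass:  {c_glass / f * 1e9:.0f} nm")   # 400 nm: the wavelength shrank
# because the velocity dropped; the frequency (the colour) stayed the same.
```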


Going back to the subject of white light and its components, here are a few of the slides from the lecture's PowerPoint.  (See figures 8, 9, 10, 11 and 12)  

Figure 8 - Components of "White" Light  

Figure 9 - The Visible Light Spectrum  

Figure 10 - The Visible Light Spectrum - 2  

Figure 11 - The Visible Light Spectrum - 3  

Figure 12 - The Sun's Spectrum  


Waves with frequencies a little less than the human eye can perceive (waves with a wavelength longer than 700nm) are known as infrared (IR).  An example of this is compact disc (CD) lasers, which operate at 780nm.  
Waves with frequencies a little higher than the human eye can perceive (waves with a wavelength shorter than 400nm) are known as ultraviolet (UV).  
The energy in sunlight at the Earth's surface is about 50% electromagnetic waves visible to the human eye, 3% ultraviolet, and the rest infrared.  
Many animals have different seeing capabilities from humans.  For example: dogs see far fewer colours than we do; bees can see UV; pit vipers can detect IR.  
Monochromatic light is very rare, and most light sources do not produce it.  One that does, however, is a (sodium) street lamp.  Under it we effectively see in shades of a single colour, so the world appears as it would if we were completely colour blind.  The image below restates this in text, but it also has two pictures that are useful in further explaining monochromatic light, which is why it has been placed in this blog.  (See figure 13)  

Figure 13 - Narrow and Broad-band Light  


To measure brightness of light, scientists use two measurement units known as candela and lumen.  
"Photographers don't need to use these measurements because they use light exposure meters that are calibrated (ultimately) against a scientific standard."  -  From the PowerPoint lecture  
An object's brightness is a very subjective quality; it depends on the object's reflectance and colour, the reflectance and colour of its surroundings, the illumination, and the adapted state of the viewer's eyes.  

Environmental Effects  

Light waves from sources in our surroundings are transmitted, reflected, absorbed, scattered and refracted by the atmosphere and objects around us.  
Specular reflected light may appear as an image (as in a mirror) or as a bright point.  An example of this is the highlight on a mirror ball.  
The type of light that shows no image or highlights (for example, the reflected light from a white sheet of paper) is diffuse reflected light, which scatters in many directions.  
"Adjacent objects may cast shadows on each other, or tend to colour one another by their reflected light."  -  From PowerPoint lecture  
By manipulating the above effects, an image scene may be altered in the viewer's perception.  (See figures 14, 15, 16 and 17)  


Figure 14 - Transmission and Reflection  

Figure 15 - Specular Reflection  
Figure 16 - Diffuse Reflection  
Figure 17 - Examples of Reflections Using a Harley and a Rider's Helmet  

The example below shows what should be done to accommodate the lighting conditions at sunset.  (See figure 18)  

Figure 18 - To Accommodate Sunset Light  

The final images below go on to explain how refraction works.  (See figures 19, 20 and 21)  



Figure 19 - Refraction  

Figure 20 - Refraction Continued  

Figure 21 - Refraction Continued  

Conclusion  
"Using our knowledge of the properties of light and its modification due to the environment, we can use the power of modern digital image manipulation software to create mood and environmental effects or to compensate for them.  
Thus we can alter the viewer's perception of the image scene, to achieve a desired effect."  -  From PowerPoint lecture.