Video Transcript:

Hello everyone, welcome to this remote sensing and GIS course.
This course consists of 24 lectures on remote sensing and GIS, of which this is the first one. You can see the module details here, and more details can be found on the website.
This course will introduce you to state-of-the-art concepts and practices of remote sensing and GIS. It starts with the fundamentals of remote sensing and GIS, and advanced methods will be covered subsequently.
I hope you will enjoy this course.
Before we start, let us answer a few questions: what is remote sensing? What do we measure through remote sensing? And what kind of output or data do we generate from it?
So let us start from the very basics: what is an image?
You can see this photograph; it was captured with a normal camera, and normally we call it an image.
This is another image generated with a normal camera, or maybe a DSLR.
Here is another image, and you can see its coverage is larger; this kind of photograph is generally captured using airborne sensors or maybe satellite sensors.
Here is another image produced by a satellite, and this one is also from a satellite.
So what exactly is the difference? You can compare them from the beginning. Basically, these images are generated either from a normal camera or from a satellite, so their mode of acquisition is different. These are the two different modes through which we capture images.
So basically, an image is a pictorial representation of an object or a scene. We have two different types of images: the first is analog and the second is digital.
You can see here that the house on the left side is basically a sketch, produced using paper and pencil or pen, whereas the image on the right-hand side was captured by a sensor, maybe a normal camera or a mobile phone camera. So the difference between these two is that the first one is analog and the second one is a photograph captured by a sensor.
So this is the definition, or this is how we understand what is meant by an analog image: analog images are produced by photographic sensors on paper-based or transparent media, and variations in scene characteristics are represented as variations in color or gray shades.
So for a sketch it depends on our capability, how well we can depict or represent the variation of an area or of an object, whereas with a camera it depends on the technical specifications of that camera, how well it can acquire a scene and how good an image it can produce.
Digital images, on the other hand, are produced by electro-optical sensors; you can see one example here. You can actually see this on your mobile phone: you might have captured many images or selfies, and if you zoom into those images you will find that they are made of small rectangular arrays. Those small elements, or pixels, are basically the numbers which have been captured by your sensor or camera.
And if you import them into MATLAB, or maybe into C, and look at those values, you can see they are arranged in a regular manner, where each pixel has an associated value: an object reflecting more energy, that is, an object appearing very bright in the image, will have a higher number, whereas the darker portions or darker areas will have lower values.
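As a minimal sketch of this idea (the lecture mentions MATLAB or C; here is the same inspection in Python, assuming a hypothetical grayscale file named photo.png):

```python
# Minimal sketch: inspect the digital numbers of an image.
# "photo.png" is a hypothetical file name; any grayscale image works.
import numpy as np
from PIL import Image

img = np.array(Image.open("photo.png").convert("L"))  # 2-D array of digital numbers

print(img.shape)             # rows x columns of pixels, arranged in a regular grid
print(img[0, 0])             # one pixel's value (for 8-bit data: 0 = dark, 255 = bright)
print(img.min(), img.max())  # darker areas have lower values, brighter areas higher
```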
Now, about histograms: from such an image you can easily generate a histogram, which will look like this. It has an X and a Y axis. On the X axis you have the digital numbers captured by your satellite or sensor, and on the Y axis you have their frequency. For a Gaussian distribution, as you may know, you will have this kind of bell-shaped curve.
But when you generate a histogram from an image captured in a natural setting, what will happen? The histogram may appear like this, or maybe like this. Here you can see that the center one is the Gaussian distribution, but on the right-hand side the bulk of the values is skewed, concentrated in the lower range, whereas on the left-hand side it is shifted towards the higher values. Let us say that the images we have considered to generate these histograms all have a radiometric range of 0 to 255.
So by looking at a histogram generated from satellite data, or maybe from a normal camera, you can easily find out the brightness and contrast of the image. If an image has this Gaussian distribution, it means it is occupying the full available range of the radiometric resolution, whereas this image has occupied only this particular range. So if you see the right-hand-side image, it will look darker, because its values are concentrated in the lower range, whereas this one will appear brighter, because its values are towards the higher end.
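As a small illustration, a histogram like the ones on the slide can be generated with a few lines of Python (a sketch, reusing the hypothetical photo.png from above):

```python
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

img = np.array(Image.open("photo.png").convert("L"))

plt.hist(img.ravel(), bins=256, range=(0, 255))  # X: digital numbers, Y: frequency
plt.xlabel("Digital number")
plt.ylabel("Frequency")
plt.show()

# Values bunched near 0 -> a dark image; near 255 -> a bright image;
# a spread across the full 0-255 range -> good contrast.
print("brightness (mean):", img.mean(), " contrast (std):", img.std())
```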
So now you have understood what an image is and how it can be generated. It does not matter whether you have used a normal camera, a sensor, or a satellite to capture an object or an area and generate an image.
So remote sensing is the art and science of making measurements about an object or the
environment without being in physical contact with it.
Remember, here I have written "environment" also. So what is meant by environment? Normally, when we take a photograph with our regular camera, we capture objects or we capture ourselves. How we capture the environment is something you will understand gradually.
Once again: remote sensing is the art and science of making measurements about an object or the environment without being in physical contact with it.
The best example of remote sensing is our eyes. We are actively engaged in remote sensing through our eyes, and our eyes' sensitivity is limited to the visible range, from 400 to 700 nanometers: 400 to 500 nm is blue, 500 to 600 nm is green, and 600 to 700 nm is red. So whatever we see is a combination of these blue, green, and red wavelengths.
In its generally accepted meaning, though, remote sensing refers to instrument-based techniques. I hope you are familiar with this diagram.
Here we have radio waves, microwaves, infrared, visible, ultraviolet, X-rays, and gamma rays, and you can see the information which can be captured across this electromagnetic wavelength range. Here you can see the atmospheric windows; that means these wavelengths are actually allowed to pass through our atmosphere. Radio waves are allowed, while in the microwave and infrared certain portions are not; even in the visible, some portions are not allowed. The next rows of the figure show the electromagnetic waves themselves, then the wavelength, then a size comparison indicating what kind of information we can get at each wavelength, then the corresponding frequency, and finally the temperature for thermal emission. So starting from a nucleus up to a building, you can always use this technology.
For example, in this case the sun is our source of light. So when the sun illuminates our surface, what will happen? The energy will either get reflected from the surface, get transmitted, or get absorbed; and if the object absorbs some amount of energy, then later on it will be emitted.
In the visible, the range is 0.4 to 0.7 micrometers. The VNIR and SWIR wavelength range starts from 0.7 micrometers and extends to 2.5 micrometers, and this comes under the reflective domain. In the next one, when there is absorption of the incident energy, the object has to maintain equilibrium with its surroundings, so what will happen? There will be some emission, and that emitted energy will be captured by our sensor. This is the thermal infrared wavelength region, which extends from 3 micrometers to 16 micrometers. The next one is microwave, whose range is 0.1 centimeters to 1 meter. And what happens here is that the sensor or satellite has its own source of illumination: it will illuminate the surface, and the backscattered energy will be captured.
So what happens when light hits an object? This is actually very basic information that I am giving you, but it is very important in order to understand remote sensing technology. Here you can see there is an object and there is a source; light travels from the source to this target and interacts with it. Then what will happen? Some amount of energy will get reflected, some will be scattered, some will be absorbed, some will be emitted by the object, and finally some will be transmitted. If you add up all these energies, absorbed, transmitted, scattered, emitted, and reflected, the sum will be equal to your incident light.
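As a tiny numerical check of this balance (the fractions below are made-up example numbers, not measurements):

```python
# Energy balance at the target, as stated above.
incident = 1.0
reflected, scattered, absorbed, transmitted = 0.35, 0.10, 0.40, 0.15  # example fractions

assert abs((reflected + scattered + absorbed + transmitted) - incident) < 1e-9
# The absorbed fraction is what the object later re-emits (the thermal domain).
```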
Internal atomic structure and composition are the reason for absorption features. Because this energy travels from a source to a target, the characteristics of that particular object cause the energy to be converted into these different forms. So if you understand the reflected, emitted, scattered, absorbed, or transmitted energies, you can understand the target.
So if you have a sensor and you can measure the reflected or emitted energy, then the information captured will look like this. In this case, the measurements are simply values; we are not generating any image. There is an instrument which can measure the emitted energy at regular intervals for a given target. Using it, you will have maybe hundreds, thousands, or tens of thousands of data points captured by the sensor, and then you can easily plot wavelength versus value.
These are the regions where you can see some change, and remember, it is the material characteristics that produce all these troughs. So it is important that these troughs be studied thoroughly, because if you do not study them, you will lose the information. That is why I have said that the troughs are the places where things are happening: the distribution of electromagnetic radiation emitted or absorbed by a particular object is a function of wavelength. So always remember, a material can be characterized using this information, provided its absorption features are identified at the corresponding wavelength regions.
And how do you plot such a spectrum? We have already discussed that: you have a sensor which captures the emitted or reflected energy at regular intervals; the measurements are stored as digital numbers, and those are what we display here.
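As a sketch of such a plot (the wavelength grid and reflectance curve below are synthetic stand-ins for real instrument readings, with one artificial absorption trough near 1.4 micrometers):

```python
import numpy as np
import matplotlib.pyplot as plt

wavelength = np.linspace(0.4, 2.5, 500)  # micrometers, a synthetic sampling grid
reflectance = 0.5 - 0.15 * np.exp(-(((wavelength - 1.4) / 0.05) ** 2))  # one trough

plt.plot(wavelength, reflectance)
plt.xlabel("Wavelength (micrometers)")
plt.ylabel("Measured value")
plt.show()
```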
There are various stages in remote sensing, especially when we are talking about spaceborne satellites. You can see the stages listed here: electromagnetic energy is reflected or emitted by the objects. First of all, the sun is our source, and its energy is reflected or emitted from the surface. Then it reaches our satellite, and the satellite records this energy; an ADC (analog-to-digital converter) converts the incident energy into values, those values are recorded, and an image is generated. Finally, the stored values and the captured image are transmitted to our ground station, and from the ground station the data is distributed to users through different media. Nowadays, a link is sufficient to distribute your data.
So again, to make this clear: the sun is our source of energy, irradiating our surface, and this energy actually interacts with our atmosphere. In the atmosphere we have different gases and aerosols, and these play a role in blocking the energy in particular wavelength regions. So the energy coming from the sun to the surface is changed or modified by our atmosphere. Some of the energy comes directly to the surface, and once it gets reflected from the surface, it has to pass through the atmosphere again before it reaches the satellite. There can also be a process where light is scattered by the atmosphere itself, so some of the energy is reflected directly from the atmosphere and gets added to the surface-reflected energy. So this is one of the processes.
In the other process, emitted energy is involved: some of the energy which was absorbed at the surface will be released after some time. That emitted energy will reach our sensor, and there will be some atmospheric emission as well. And then finally you have field data collection.
So here it is important to understand what the different types of remote sensing are, as well as how we do it. There are two different types of remote sensing: one is active, the other is passive.
You can see here an example of passive remote sensing, where the sun is our source: it illuminates our target or surface, the energy gets reflected from the surface, and then it is captured by our sensor. In this case, the sensor uses the sun's energy as its source of radiation.
In the other case, the source is actually carried by the sensor, so the sensor uses its own source of radiation. You can see the signal also coming from the sensor itself, and once it gets reflected, it is received by the same sensor after a certain time. And here, this is the flight direction. I hope the distinction between active and passive remote sensing is clear, because it is very important and you will rely on it at a later stage.
Now, there are different types of orbits, that is, different ways these satellites are placed and monitor our surface. One of them is the polar orbit: polar-orbiting satellites have an orbit altitude of approximately 850 kilometers, which corresponds to roughly 14 orbits per day, and they are used for earth exploration as well as earth observation. The next one, which is very commonly used, is the geostationary satellite.
These orbits are a little farther out: the altitude of this orbit is 35,786 kilometers, approximately 36,000 kilometers, with one orbit in 24 hours. The satellite appears to be fixed in the sky and looks at the same location on the earth, which is very clear from this image: this particular satellite is always looking at one particular position. This is very good for regular monitoring of an area or detecting changes, but due to the high altitude, the spatial resolution is very coarse. I know spatial resolution is a new term for you, but just wait a few more slides and you will understand it. Geostationary orbits are used for earth exploration, weather monitoring, and communication.
The next one is the low-inclination orbit: the orbit altitude is approximately 160 kilometers and the orbital period is about 88 minutes, providing high revisit frequency over the tropics. Objects below 160 kilometers experience rapid orbital decay and altitude loss. You can see an example of these low-orbiting satellites here.
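The altitudes and periods quoted above are consistent with Kepler's third law for a circular orbit, T = 2π√(a³/μ); a quick sketch to verify (standard physical constants, not values from the slides):

```python
import math

MU = 3.986e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6  # mean Earth radius, m

def orbital_period_minutes(altitude_km):
    a = R_EARTH + altitude_km * 1e3  # radius of a circular orbit
    return 2 * math.pi * math.sqrt(a ** 3 / MU) / 60

for h in (160, 850, 35786):
    print(f"{h:>6} km -> {orbital_period_minutes(h):7.1f} min")
# ~88 min at 160 km, ~102 min (about 14 orbits/day) at 850 km,
# and ~1436 min (24 hours) at 35,786 km.
```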
These are the different platforms from which we capture remote sensing data. Let us start with satellites: here the orbit is fixed in space, so these are called spaceborne satellites. There are also space shuttles, which carry cameras and capture images. In the airborne category, we basically use aerial photography or airborne SAR: we use different cameras operating at different wavelengths, attached to a helicopter, an aircraft, or maybe drones, as you can see here. So there are different modes of acquisition, and depending on their altitude, their nomenclature is different.
Here you can see an example of an airborne survey and how it is done. This particular area has already been captured, and the aircraft is moving forward; this is the active area where the airborne sensor is currently capturing the image.
There are different types of applications: site selection studies, natural resource management, earth and planetary exploration, environmental monitoring, change detection, defense-related activities, urban and rural development and planning, crop yield forecasting, hazard zonation, and disaster mitigation. These are a few examples, not a complete list of where you can use remote sensing and GIS.
I will give you some overview. For example, if you want to do site selection for your studies, here is one case where we use remote sensing data: for choosing where to build a new house or a new mall, you can definitely use remotely sensed data.
This is another interesting application of remote sensing: you can always find out what the path of your river is. For environmental studies you can use remote sensing data, and for natural hazard monitoring you can use temporal remote sensing data.
Here you can see one of the examples: this glacier is moving, and you can see this area shifting slowly. This has been captured by satellites over the years, so you can identify the direction of motion and the mass that is involved in the flow, and you can prepare for the hazard.
For national security also, you can monitor your border areas, or you can detect changes in the forest, whether there is any forest encroachment or anything else happening in inaccessible areas, and you can prepare land use/land cover maps: how much area is used for agriculture, how much for residential purposes, and how much for commercial purposes. Those things you can easily find out using satellite remote sensing data.
Flood hazard monitoring: here you can see different zones highlighted as normal, very high, high, moderate, and low. These things you can identify and study using satellite remote sensing. River morphology is actually a very interesting problem, where you can measure how much a river has shifted or migrated from one place to another, as you can see here.
Then there is planetary exploration; I would say remote sensing is the only available technique to explore the planets. Here you can see some of my work which has already been published, and which I will share with you over time. And this is one interesting thing I want to highlight: earth, moon, and Mars, and how we are exploring them. We have been using remote sensing for a very long time now; we are successfully mapping Mars and the moon, and the earth is always in the picture.
Now, the strengths of remote sensing data, or satellite remote sensing data: you have large area coverage, which you can get only from space; temporal images, meaning you can monitor an area frequently; and sensors sensitive to different wavelength regions, which I would say is the best strength, because our eyes are limited to the visible range, whereas these sensors are capable of measuring the reflected, emitted, and backscattered energies in other wavelength regions also. Add to that access to inaccessible areas and earth and planetary exploration. These are a few strengths of satellite remote sensing that I wanted to highlight; there are many more.
There are four types of resolution that are considered for satellite or airborne remote sensing data: the spatial, spectral, radiometric, and temporal resolution of any data set.
Temporal resolution depends on the return time of the satellite, and the return time is a function of the altitude at which the satellite is placed: the higher the altitude, the larger the circumference of the orbit, and the longer it takes to orbit the earth. With the ability to tilt the camera view, the revisit capability can be increased. I will explain this again, but here you can see this particular area has been captured at 5:30, 8:30, 11:30, and 16:00.
This is another example of temporal resolution, where the red color represents vegetation. In this case, on 17th Jan you have more vegetation; here you still have vegetation but the color becomes darker; here it is reducing again; and in these two cases the vegetation is very sparse. In the next example you have annual changes: you see 1995, 1996, 1997, 1998, and 1999. Every year the same area has been monitored using satellite data, and you can easily find the vegetation changes.
This is another example where you can see long-term change: from 1972 to 2002, a new lake has appeared in the same area. And here is the example I already showed you in an earlier slide: you can see how this river is migrating from one place to another, which you can always track when your satellite has good temporal resolution. This is another example of temporal data, where we monitor clouds; based on this we forecast the weather, whether a cloud system is moving towards our location or away from it.
Next is spectral resolution. Spectral resolution refers to the number and dimension of the specific wavelength intervals in the electromagnetic spectrum to which a sensor is sensitive. To understand this, suppose our wavelength axis starts from 0.4 and runs through 0.5, 0.6, and 0.7 micrometers. Now, you have a sensor which can capture only one particular wavelength interval; for that interval, you will get one image of a given area. Alternatively, the same sensor can produce two images of the same area, using different wavelength regions.
So here the first image is captured in the first interval and the second image in the second interval, like this. Once you have the first and second images generated from your satellite data, there can also be a third one, and so on. Just imagine: with a normal camera we always get only one image, but what I am telling you here is that across the wavelength axis you can have 3, 4, 10, or hundreds of images.
If your sensor is capable of resolving only the 0.4 to 0.5 micrometer wavelength range, it can generate one image of the area; another set of detectors capable of resolving the 0.5 to 0.6 micrometer range generates the second image, and likewise the third. Once you have this kind of information, consider these three images as bands; from now on I will always call them bands. In these three images, the first pixel at the lower bottom corner represents the same ground area in each band: the detectors are looking from the top and imaging the same area. Now, remember that these are nothing but pixels, and these pixels hold values, digital numbers. So if you extract those values from the three images, you can plot the three values, and they will look like this, for example.
In case your sensor is capable of resolving a larger number of bands, that is, a smaller bandwidth, you get 0.40 to 0.41 micrometers as one image, 0.41 to 0.42 as a second image, 0.42 to 0.43 as a third image, and so on. If you have several images like this, your number of measurements, or values, will increase. So now you will understand what I am trying to show here. If you have only 5 bands, your data will look like this; and when you have hundreds of bands, then for one given area you will have that many values, and you can generate a much smoother spectrum. In the 5-band case, what happens in between is missing, but in the many-band case it is captured, because the wavelength range from one band to the next is very small: the first band is 0.40 to 0.41 and the second band is 0.41 to 0.42, whereas in the other case you have 0.4 to 0.5. That means the latter is actually low resolution. So this is called low spectral resolution, and this is called high spectral resolution. With high spectral resolution you have more information about the target; and remember the earlier slide where I told you that the troughs are the places where things are happening. We need to study their shape, size, and position to identify the material characteristics or composition.
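A small sketch of this difference, sampling one synthetic spectrum with 0.1-micrometer bands versus 0.01-micrometer bands (all numbers here are illustrative, not from any real sensor):

```python
import numpy as np

wl = np.linspace(0.4, 2.5, 2101)                             # synthetic wavelength grid
spectrum = 0.5 - 0.2 * np.exp(-(((wl - 1.0) / 0.02) ** 2))   # narrow trough at 1.0 um

def band_average(band_edges):
    """One value per band: the spectrum averaged over (lower, upper)."""
    return [spectrum[(wl >= lo) & (wl < hi)].mean() for lo, hi in band_edges]

coarse = band_average([(0.4 + 0.1 * i, 0.5 + 0.1 * i) for i in range(21)])     # low res
fine = band_average([(0.4 + 0.01 * i, 0.41 + 0.01 * i) for i in range(210)])   # high res
# The 0.1-um bands smear the narrow trough out; the 0.01-um bands resolve its
# shape, size, and position -- exactly the features used to identify a material.
```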
In the next slide, a spectral band is defined in terms of its central wavelength and bandwidth. Remember, for the 0.4 to 0.5 micrometer band, this is the range, but there will be a central wavelength, which is 0.45 micrometers. Next is the bandwidth, which is this width here. So in order to understand spectral resolution, you need to understand what central wavelength and bandwidth are. The bandwidth is defined by a lower and an upper cut-off wavelength, these two values, and the spectral resolution is lambda_2 - lambda_1, which describes the wavelength interval in which the observation is made. This applies to one image: we have used one sensor which is capable of resolving the energy only between 0.4 and 0.5 micrometers, it generates one image, and that image we then use for our application. And delta lambda is the full width at half maximum.
Here you need to understand what I am talking about. This is your spacecraft; remember, I am talking about spaceborne satellites, where this is the platform. Several sensors may be attached to it; in generic terms, I can call them cameras, so different cameras are attached here. These sensors look at the ground and each generates an image. What I am describing applies to sensors which are capable of generating images in a particular wavelength range. Now I will zoom into this part: this is our sensor, or in general terms, a camera. Inside this camera, many detectors are attached, so there are several detectors here. These arrays, or sets, of detectors are sensitive to a particular wavelength range, and each looks at the ground individually; taken together, they generate one image as the platform moves along like this. As the platform moves and the detectors continuously record values, they build up one image; this is what is happening here.
The selection of bandwidth is a trade-off between the energy to be collected and the spectral shape of the feature to be observed. If your sensor is capturing this particular area in the 0.4 to 0.5 micrometer range, then the 0.4 to 0.5 micrometer energy coming from the surface reaches the sensor and is recorded, and that generates the pixel values; we will see in the next slides how these pixels are generated.
The important point here is the relationship between wavelength and energy: the shorter the wavelength, the higher the energy. In the shorter wavelength region you have high energy, and in the longer wavelength region you have less energy. The bandwidth is selected based on that: in the shorter wavelength range, like the visible, you have enough energy to be resolved by your spaceborne sensor, but when you move to thermal remote sensing, the energy is very low. So you need to increase the size of the ground pixels, or in other words, widen the wavelength range, so that you have enough energy to be recorded by your sensor.
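This inverse relationship is E = hc/λ; a quick sketch comparing a visible and a thermal-infrared photon (standard physical constants):

```python
H = 6.626e-34  # Planck's constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy_joules(wavelength_um):
    return H * C / (wavelength_um * 1e-6)

print(photon_energy_joules(0.5))   # visible, ~4.0e-19 J per photon
print(photon_energy_joules(10.0))  # thermal infrared, ~2.0e-20 J: 20x less energy
```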
Now, the concept of FWHM, full width at half maximum: if you plot the sensitivity of a detector, what will happen? The curve will look like this; in this region it has very little sensitivity, but in this region it has maximum sensitivity. So if we simply say that this particular detector is sensitive from lambda_1 to lambda_2, that does not convey much. What we need to do is identify, or demarcate, the effectively sensitive region, the region where the detector has very good sensitivity. That is why we always report the FWHM for a given detector or sensor.
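As a sketch, the FWHM can be read off a sampled response curve like this (the Gaussian response centered at 0.45 micrometers is an assumed example):

```python
import numpy as np

wl = np.linspace(0.40, 0.50, 1001)                   # micrometers
response = np.exp(-0.5 * ((wl - 0.45) / 0.01) ** 2)  # assumed detector sensitivity

half_max = response.max() / 2.0
above = wl[response >= half_max]  # the effectively sensitive region
fwhm = above.max() - above.min()
print(f"FWHM = {fwhm:.4f} um")    # ~0.0235 um (2.355 x sigma for a Gaussian)
```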
Now, there is another set of concepts: panchromatic, multispectral, and hyperspectral. In panchromatic you have only a one-band image, and here the bandwidth, remember, the starting and ending wavelengths of your detector, is wide: the starting and ending wavelengths are far from each other, so the detector collects enough energy to be resolved. That is called panchromatic: a one-band image, like that of a normal camera. In the case of multispectral, in contrast, you have 4, 5, or 10 different bands.
Remember, whenever we say panchromatic, it may be 0.4 to 0.7 micrometers, and that whole range is used to generate the panchromatic image. In the case of multispectral data, that same wavelength interval is divided into different images: you may have bands 1, 2, 3, and 4 captured between 0.4 and 0.7 micrometers, and likewise you generate the first, second, third, fourth, and fifth images. Once you have these 5 images, you can take the first pixel of each and plot values 1, 2, 3, 4, 5; the spectrum will look like this, for example. But remember, in multispectral imaging we do not worry about the gaps in between; there are some gaps here, and those are simply not resolved.
Next is hyperspectral remote sensing data, where you have hundreds or thousands of bands which are contiguous in nature. What do we mean by contiguous? Take the 0.4 to 0.7 micrometer range as an example: this is the first band, this is the second band, this is the third band, and so on, with many more bands, and the bandwidths are uniform. The definition of hyperspectral includes this contiguous nature of the measurements: contiguous means there is no gap between the first and second images. This is very important: when you have such data sets, measured from a satellite, through an airborne survey, or with a lab-based instrument, they are called hyperspectral.
And why is spectral resolution important? Because when you have a larger number of values for a given area across the wavelength axis, you get the kind of information you can see here. For different materials, these features change; and they change because of the material characteristics, so you can easily determine the chemical composition of a material based on these spectra.
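In terms of data structures, the three cases differ only in the number of bands stacked per pixel; a sketch with assumed sizes:

```python
import numpy as np

rows, cols = 100, 100  # assumed image size
pan = np.zeros((rows, cols))         # panchromatic: one broad band
multi = np.zeros((rows, cols, 5))    # multispectral: a few separated bands
hyper = np.zeros((rows, cols, 200))  # hyperspectral: hundreds of contiguous bands

# One ground pixel's spectrum is simply that pixel read across all bands:
pixel_spectrum = hyper[0, 0, :]      # 200 values -> a smooth, plottable spectrum
print(pan.shape, multi.shape, hyper.shape, pixel_spectrum.shape)
```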
Now, the next one is spatial resolution. It is a measure of the smallest angular or linear separation between two objects that can be resolved by the sensor. Let me show you the figure: I hope you remember the detectors in your sensor; each looks at a particular area, and this is what generates your pixel. So spatial resolution depends on the capability of the sensor, how small an area it can resolve into values; those values are recorded as digital numbers, and in images we always call them pixels.
This image has been generated by a spaceborne sensor, and its resolution is 23 meters by 23 meters. That means one pixel of this image represents a 23 by 23 meter area on the ground. This is how spatial resolution matters: if each pixel represented only one meter by one meter, this image would be much sharper than this one, as you can see here. The next one is 2.5 meters by 2.5 meters, and the next is 0.5 meters by 0.5 meters. Now you can compare all these images one by one and see how the information increases as the resolution becomes finer. This is the importance and significance of spatial resolution in remotely sensed data.
Spatial resolution is the projection of a detector element onto the ground through the optics; I have already explained this, but it will become clearer now. There are two terms, IFOV and FOV. The IFOV (instantaneous field of view) is basically the viewing angle of a single detector, while the field of view is the complete viewing angle of the sensor. One pixel is generated by the IFOV: if you know the altitude of the platform and the look angle of your detector, that is, the height and the IFOV, then you can always work out how much area that detector covers on the ground. That is the significance of the IFOV; the FOV, in turn, explains how much area the sensor covers all together, and this is the swath.
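A sketch of that geometry at nadir (the altitude, IFOV, and FOV values are hypothetical, chosen only to give round numbers):

```python
import math

def ground_size_m(altitude_m, angle_rad):
    """Ground distance spanned by a viewing angle at nadir (flat-ground approximation)."""
    return 2 * altitude_m * math.tan(angle_rad / 2)

H = 850e3               # hypothetical platform altitude, m
ifov = 30e-6            # hypothetical IFOV of one detector, radians
fov = math.radians(15)  # hypothetical full FOV of the sensor

print("pixel size :", round(ground_size_m(H, ifov), 1), "m")     # ~25.5 m per pixel
print("swath width:", round(ground_size_m(H, fov) / 1e3), "km")  # ~224 km swath
```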
Next is radiometric resolution. It defines the sensitivity of a detector, how finely it can differentiate the incoming radiation into different levels. Suppose you have a torch and you are illuminating a target, and the reflected energy is coming to our sensor; if the sensor is capable of differentiating only two ranges of values, then what will happen? There will be 0 or 1: either there is some value or there is none. So radiometric resolution is very important in terms of the level of information captured by the sensor.
Here you can see 1-bit data, 2-bit data, 3-bit data, and 4-bit data. With 1-bit data, the incoming radiation is differentiated into two levels; with 2-bit data you have 2 to the power 2 levels, with 3-bit data 2 to the power 3 levels, and with 4-bit data 2 to the power 4 levels. So comparing 1-bit, 2-bit, 3-bit, and 4-bit data, the 4-bit data will definitely give you more information about the area, because it can depict smaller changes in contrast, that is, in the values. Here you can see 8 gray levels: with values from 0 to 7, or maybe 1 to 8, eight shades can be depicted, but in the case of 256 levels you have the 0 to 255 range (or 1 to 256), and that is why you get more information in that image.
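A sketch of how bit depth limits what the sensor can distinguish (the radiance values are made-up examples on a 0-1 scale):

```python
import numpy as np

def quantize(radiance, bits):
    """Map radiance in [0, 1) onto one of 2**bits digital-number levels."""
    levels = 2 ** bits
    return np.minimum((radiance * levels).astype(int), levels - 1)

radiance = np.array([0.10, 0.12, 0.50, 0.52, 0.90])  # example scene radiances
for bits in (1, 2, 3, 4, 8):
    print(bits, "bit ->", quantize(radiance, bits))
# 1-bit data cannot separate 0.10 from 0.12; 8-bit data (0-255) easily can.
```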
So far, we have covered spatial resolution, spectral resolution, radiometric resolution, and temporal resolution and why they are important, and along the way we have also covered panchromatic, multispectral, and hyperspectral data and how they differ from each other.
Now the next topic is sensor technology: how these images are actually generated. I told you that this is the platform, on it you have your sensor, and in that sensor you have detectors. For the time being, let us take only 4 detectors. How do they image this particular area? How are these areas captured in this particular image? How do the detectors move? They might start from here and then move to there, but is it so, or is there some other mode of acquisition? That is what we will see here.
First there is the whisk broom imaging sensor, or whisk broom imaging technology; you can see how it images a particular area. It is something like sweeping your floor with a broom; I hope this is clear. Scanning is performed by an oscillating mirror deflecting upwelling radiation from the earth onto wavelength-sensitive photodetectors.
Next you have the push broom, where you are effectively pushing your sensor along, like this. In push broom imaging technology, the sensor consists of a linear array of detectors, equal in number to the number of pixels in a row of the image; it is more stable compared to whisk broom. A small sketch of the two scanning orders follows below.
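A toy sketch of the two acquisition orders for a small scene (purely illustrative; real systems differ in many details):

```python
rows, cols = 4, 6  # assumed tiny scene

# Whisk broom: an oscillating mirror sweeps one detector across the track,
# so pixels within a line arrive one at a time.
whisk_order = [(r, c) for r in range(rows) for c in range(cols)]

# Push broom: a linear array of `cols` detectors records a whole image row at
# once; the platform's forward motion supplies the successive rows.
push_order = [[(r, c) for c in range(cols)] for r in range(rows)]

print(whisk_order[:6])  # first scan line, pixel by pixel
print(push_order[0])    # first scan line, captured simultaneously
```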
Within push broom technology you also have frame acquisition. And here are some very good books you can refer to.
Thank you.