Star Earth Energies

How to interpret what is in a photograph 

How to Interpret a Photograph

It might seem an unnecessary thing to explain, but most people don't have a real appreciation for what a photograph is.  Most will describe photography in terms of professional standards.


However, professional photographers don't think in colour or light - they think in terms of greyscale, of shades of grey.  Grey isn't a colour, and neither is black or white.  In art, black and white, when added to a colour, produce either lighter or darker colours.  Likewise, there is no grey with photons.

What is a photograph ?

In a simple explanation, photons are quantum particles, vibrating at certain frequencies, that can exhibit colour in the visible spectrum.  Analysis of the colour is an indication of the photon frequency that the camera recorded.

A photon is an elementary particle, the quantum of all forms of electromagnetic radiation including light.  It is the force carrier for the electromagnetic force, even when static, via virtual photons.


The photon has zero rest mass and, as a result, the interactions of this force with matter are observable over long distances, at both the microscopic and macroscopic levels.  Like all elementary particles, photons are currently best explained by quantum mechanics and exhibit wave–particle duality, showing properties of both waves and particles.


For example, a single photon may be refracted by a lens and exhibit wave interference with itself, and it can behave as a particle with a definite and finite measurable position and momentum.  The photon's wave and quanta qualities are two observable aspects of a single phenomenon, and cannot be described by any mechanical model; a representation of this dual property of light, which assumes certain points on the wavefront to be the seat of the energy, is not possible.  The quanta in a light wave cannot be spatially localized.  Some defined physical parameters of a photon are listed.  See Photons, further below, for more.

A photon is a quantum particle, indicative of 'creation'.  As particles interact, collide or repel, more activity is produced.  Of all this activity, the majority occurs outside of our visual range.


Photons are represented as a wavelength operating at certain frequencies.  Some of those photons are visible.  What we see as light, and what we are able to see, are all photons vibrating at their own resonant frequency.  Colour, photonic light and wavelength enable us to see anything at all.


The colour can indicate composition, or it can represent interference, as in redshift and blueshift (more on that below).   

Photons are all about frequency and are measured by us in hertz.  Some are visible, some are audible, but most lie outside of our normal sight and sound range.


Frequencies capable of being heard by us are called audio or sonic.  The typical range is between 20 and 20,000 hertz.  This range of hearing varies, with a bottom end of 15 Hz accepted by some as the lowest range of human hearing, up to 25,000-28,000 hertz at the top end of the range.  The sounds, or frequencies, we can't hear are called ultrasonic (above our range) or infrasonic (below it).


Light capable of being perceived visually by us has wavelengths roughly between 380 and 780 nanometres.  Even though we see colours, our eyesight is not measured by the range of frequencies we can see, but rather by how far our eyes can see.


Currently I am in the process of converting these wavelengths into equivalent hertz, to be correlated with known healing frequencies and descriptions.  It is, however, an arduous task, with conflicting information about the colours and what their actual resonant frequency is.
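The conversion step itself is simple arithmetic.  Below is a minimal sketch, assuming only the standard relation f = c / λ; the healing-frequency correlations are not included, as that work is still in progress:

```python
# A minimal sketch of converting wavelength (nanometres) to frequency
# (hertz) via f = c / wavelength.  Only the physics step is shown here.

C = 299_792_458  # speed of light in metres per second

def wavelength_nm_to_hz(wavelength_nm: float) -> float:
    """Convert a wavelength in nanometres to a frequency in hertz."""
    return C / (wavelength_nm * 1e-9)

# The rough limits of human vision quoted above:
print(f"780 nm (deep red): {wavelength_nm_to_hz(780):.3e} Hz")  # ~3.8e14 Hz
print(f"380 nm (violet):   {wavelength_nm_to_hz(380):.3e} Hz")  # ~7.9e14 Hz
```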


Every colour and tone has a corresponding frequency, and so does every sound and tone.  Both can be represented mathematically as numbers.  Colour and sound, tones and harmonics, are all aspects of understanding photons.  But if we are to truly understand photons we need to look at elements and chemical reactions.  Many elements will produce varying spectral results, according to their state and their interaction with other elements.  This I will discuss in another article; for the time being we will stick to what we understand the best, our own sight and hearing.


In the image below you can see that our visible light range is very small compared to all that we understand exists.  You will also notice there is no white or black, only the colours and shades of the rainbow.  This represents the range of light frequencies we can see - the colours of the rainbow.  I wonder, if you could see all the rest of the whole spectrum, what colours would be there?

There is no white light.  What we perceive as white light is intense yellow.  Also observe how yellow is the middle point: on the left side is blue, on the right side is red.  White light is a concept rather than a reality.  White brings to mind something solid - white walls, white clothes, white clouds.  But light is translucent, and all-illuminating.  It takes away the dark and reveals the colour of life.  This is in direct contrast to those who have taken on this name with regard to their dark activities, i.e. the Illuminati.  Do not let them tarnish what illumination of real light brings.


White light doesn't exist, but so much intense yellow exists that it appears, if we were to describe it, as white.  But you would also note that besides the Sun always being depicted as yellow, many ancient cultures talk of a golden light.  Many enlightened seekers are also meditating using the principle of golden light.  Perhaps gold was so much sought after in ancient times, and today, not because of its uses, but simply because it is the closest thing we have to representing the true colour of golden light.


If you were asked what colour gold is, you would say gold.  Some call it yellow, but then have to add 'metallic yellow' - but then it's not yellow anymore.  Gold is gold coloured, and the closest representation to the golden light.  If light were white, you would think silver would be the most natural way to describe the Sun - but the Sun is golden and the Moon is silver.  This would suggest that the most intense energy is golden, and when reflected/divided it becomes silver.  But still, neither of these lights has split enough to discern colour, only light.


As the light comes through our atmosphere it has already been through the rigours of the Van Allen radiation belt and has survived and made it through.  In its journey through our atmosphere the light particles enter with a bang, then continue to interact with every other particle around.  As the light comes closer to the lower atmosphere, more light from these other particle collisions is created.  Heat is the catalyst for change, and as the light falls towards the lower atmosphere it begins to cool and particle interaction slows.  It is now able to be bounced around in the clouds or any other interference like smoke or haze.


Light is a particle, of quantum size, that through its interaction with other particles creates light.  Something so small and invisible creates what is visible.  The Sun is one big photon maker, of such intensity that it has become visual.  Imagine how many photons must make up the Sun for it to be constantly 'giving birth' to them.  And as such, it must also be directly connected to Source as well, because photons come directly from Source.  If you might recall, we are also passing through a photon belt, decades wide - so how does that affect the Sun and this solar system?


White light doesn't exist, other than as a term.  Our light is yellow, so intense it looks white.  If you look at the colour spectrum (which is a natural reflection of what we see in a rainbow), you will notice there is no white, no black, nor grey.  The universe only 'sees' in colours and tones of colours.  There is only light, or the absence of light.


Blueshift indicates an increase in energy, redshift indicates a decrease in energy (or interference).

  • Blueshift is faster, closer together and higher in frequency.
  • Redshift is slower, further apart and lower in frequency.
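A small sketch in Python supports these two points, using the standard photon-energy relation E = h × f (equivalently E = hc / λ): higher frequency means more energy per photon.

```python
# Energy per photon via E = h*c / wavelength: blue photons (shorter
# wavelength, higher frequency) carry more energy than red photons.

H = 6.626e-34        # Planck's constant, J*s
C = 299_792_458      # speed of light, m/s

def photon_energy_joules(wavelength_nm: float) -> float:
    """Energy of a single photon of the given wavelength."""
    return H * C / (wavelength_nm * 1e-9)

for name, nm in [("blue", 450), ("yellow", 580), ("red", 700)]:
    print(f"{name:6s} {nm} nm -> {photon_energy_joules(nm):.2e} J")
# blue works out to roughly 1.5x the energy per photon of red
```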

Colour can represent the composition, interaction and interference of many things you can easily photograph in the night sky.  But a tripod is essential.  Without one your photos will be double-exposed and blurred, as the camera captures you moving around while trying to take a still photo.  Tripods are essential no matter how steady your hand; when you are focused right in on something, even breathing can move it out of frame.  A tripod is the only way to capture a photo of a moving object, rather than you moving it twice as much with instability.  Breathing is something you also become aware of, as in cold night air you can soon fog up your own camera lens.


Condensation is an issue, and can produce some amazing results, and I will show these further down.  These things all affect the end result in your photo and how reliable that data is.


Yellow light appears white simply due to the intensity of light.  If you can subdue that intensity, then you can see that yellow light is indeed the aura of the Sun.  Cloud is not enough to create this subdued intensity, as the water molecules do not block out enough light.  However, a smoke veil, consisting of small particles of airborne ash, can block the intensity enough to photograph the Sun.  

Conditions have to be right to get a blueshift naturally, but it can easily be created artificially with as little light as that given off by a nearby mobile phone.  The above images were all taken in the country, with no street lights and few neighbours.  My neighbours are in bed asleep, lights off, and the few specks of light you can see in the black foreground are actually moonlight reflecting off their roofs.


The natural conditions that give blue skies at night require cloud (to reflect the moonlight) and wind (the clouds have to be moving at a certain speed).  The cloud can't be too thick, otherwise it blocks the light altogether.  I have produced many photos this way.  Another condition is that it appears more likely to occur at moonrise and moonset, when the Moon is within the range of angles affected by Rayleigh scattering.


The artificial conditions that give the blue skies are simple enough - any residual light from neighbours' lights, reflections off pools or ponds, mobile phone lights, a torch, anything with extra light either nearby or in the camera's lens view.  You can photograph above your neighbour's light and, because it's not in the camera's lens view, it won't blueshift.  You need the light to be in view, or very nearby.  As you can see this is country, not the city, and light pollution is very low.


The pinks and purples, however, are not easily replicated artificially, and to date I have been unable to do this.  There is good reason for that.  The pinks and purples are a result of the blueshifting light being blocked or reduced by cloud, which redshifts the blueshifted light.  To replicate this artificially would require sensitive equipment that can incrementally decrease and increase the secondary light source at very subtle rates.  Nature does this for free, even if it's not always easy to get the redshift.  I have photographed the blueshift many times, hundreds, both naturally and artificially.  This is completely verifiable by anyone and repeatable over and over.





As you can see, white is in the middle, but subdue it and you can see yellow going to orange - the intensity has decreased and is redshifting.  As children, we all drew the Sun yellow, and that is the exact colour of sunlight: yellow, intense yellow.


From yellow, if the sunlight is blocked by thicker smoke, then it continues to redshift until it has the appearance of looking pink - and if you recall your art classes, adding white to red makes pink.  In light terms, the redshift is made pink by the source of the light adding 'white' to the redshifted sunlight.
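In additive (light) terms, this 'adding white' can be sketched as a simple RGB blend; the 50/50 ratio below is an assumption for illustration only:

```python
# A hypothetical illustration of "adding white to red makes pink" with
# additive RGB colour: blending pure red toward white yields a pink.

def blend(c1, c2, t=0.5):
    """Linear blend of two RGB colours; t is the fraction of the second."""
    return tuple(round(a * (1 - t) + b * t) for a, b in zip(c1, c2))

red = (255, 0, 0)
white = (255, 255, 255)
print(blend(red, white))  # (255, 128, 128) - a pink
```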


The Sun is still sending out yellow light, but local atmospheric conditions are blocking US from seeing it - the density of ash particles decreases the Sun's ability to penetrate our atmosphere to the point where we can't see it.  From our perspective the sunlight has redshifted, but outside of local conditions the Sun is still shining brightly.


This information can be used and understood in several ways.  The sunlight has not decreased, but local atmospheric conditions are blocking the light.  As a result of the light being blocked, intensity is decreasing and the colour is redshifting.  However, the sunlight hasn't decreased in intensity, as the Sun is still pumping out strong rays.  From our perception it is redshifting, but the sunlight itself is not redshifting; it is still adding light from outside the atmospheric conditions.  The Sun is still adding intense light and is creating a blueshift in the redshift because of it - but this is purely from our perspective and not in reality.


Blueshift

From yellow light, blueshift is an:

*  increase in energy

*  decrease in wavelength

*  increase in frequency


Colours vary from:

*  yellow to blue (visible)

*  blue to ultraviolet


Then we are into the invisible:

*  X-rays to gamma rays


Redshift

From yellow light, redshift is an:

*  decrease in energy

*  increase in wavelength

*  decrease in frequency


Colours vary from:

*  yellow to red (visible)

*  red to infrared


Then we are into the invisible:

*  radio waves to ELF / ULF

Photographers are taught to think in greyscale, in shades of grey.  For a professional photographer colour is secondary.  We have all, at one time or another, especially at school, had to sit through flash bulbs, reflective umbrellas and white reflective surrounds.  There is all this importance on lights and lighting, light meters and exposure.


It looks professional and creates wonderful photographs of you, but it is a very controlled and unnatural environment; and a good photographer is one who can make you look better than you thought you looked.


The artificial setup of many professional photographers doesn't allow for life's colour to be expressed.  I do not use a flash, and I don't adjust the exposure.  I want to take a photograph of the light, and even 'over-exposed' ones are giving accurate information, because in photonic terms there is no such thing as over-exposure, only the recording of the light present.  Professional photographers aren't studying photons; they want to take pictures that are pretty and will make money.


There is no grey in the colour spectrum or the light spectrum or the colours of the universe.  There is only light, or photons, interacting.  All matter consists of photons, even you.  Plants appear green because green is the only frequency they don't absorb but reflect back.  That's why plants look the colour they do: light reflection.  Everything, likewise, appears visually to us because of light reflection.


In my photographic experiments with light I also photographed artificial lights.  One such night I was photographing some temporary traffic lights set up at the bridgeworks down from me.  The camera had no problem seeing the red and amber colours, but it had varying problems with the green light.  At times when the lights changed from red to green in a video, the camera couldn't see the green.  The same thing would happen at times when photographing a changing red to green light.


Was it the quick change from one spectrum to the other that made the camera go blind, or was it the green itself?  To date I am unsure, but I am more inclined to think it is the green, the frequency itself.



Another phenomenon associated with our perception is Rayleigh scattering.  After periods of time observing redshifts in photos of the Moon during rising and setting, I concluded there was some sort of effect created that redshifted the natural moonlight.  This, as I discovered, was correct, and is known as Rayleigh scattering.  It is also the reason why we can have beautiful and colourful sunrises and sunsets.  It all has to do with the reducing light, and the angle of our perception.
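The standard statement of the effect is that scattered intensity scales roughly as 1/λ⁴, so shorter (bluer) wavelengths are scattered out of the line of sight far more strongly than longer (redder) ones - a minimal sketch:

```python
# Rayleigh scattering strength goes as 1/wavelength^4, so blue light is
# scattered out of the direct beam much more than red - which is why low
# Sun and Moon light looks reddened.

def rayleigh_relative(wavelength_nm: float, reference_nm: float = 700) -> float:
    """Scattering strength relative to a 700 nm (red) reference."""
    return (reference_nm / wavelength_nm) ** 4

print(f"450 nm blue scatters ~{rayleigh_relative(450):.1f}x more than 700 nm red")
# ~5.9x
```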


You will find much information to do with light comes back to our perception.  The 3 effects I have already mentioned all have to do with our perception and how light appears to shift, or scatter - blueshift, redshift and Rayleigh scattering.


All these things are easy enough for any person to observe.  However, many effects are subtle and unseen to the naked eye, but the range of your camera can detect what you can't see.  Blueshift and blue clouds at night are easy to photograph with the Moon and a little bit of extra light, which can come from the moonlight bouncing around and reflecting in the clouds.  You can't visually detect this subtle change, but your camera can.  Through a series of photos you can have blues to pinks and purples.

So far everything I have shared with you has come from using my camera as a scientific tool to study light.  I asked the question - what can photographs of the night sky tell me about what things are, and how can the light show me these things?  Starlight, moonlight and sunlight became my study.  What could the photons, or light, tell me?  I also studied artificial lights, from house lights to street lights, safety lights, fire engine lights, police lights, traffic lights (portable and stationary), Christmas lights, light reflected off water, light through trees or other interference, light in thick cloud and thin - at night and by day.


My camera isn't professional grade.  I went to a camera shop and said give me the best camera you have that I can just point and take a picture with.  I didn't want to have to play with all the knobs and focusing.  After paying over $1000, I had the fastest memory card available, recording information at 4x the ordinary pixelation, with amazing zoom and other features.  Of course, a good tripod was a must.  When you are photographing stars you need precision and stability.  It is impossible to get a proper picture of a star without one.


To verify my information I also bought a smaller camera, with less capacity, to compare the end results.  The differences were marginal enough to state that anyone with a camera - and it doesn't have to be top of the line, even a mobile phone - can reproduce the results I have already mentioned, and will also describe below.  The only issue is stability: no matter what camera you have, if it's not on a tripod or somehow fixed from moving, your images will not produce an accurate still image.


The point I am making here is, you don't need a fancy camera or a telescope, or a fancy degree.  You just need curiosity, a steady camera, and a few handy hints.  What you discover in the course of your research leads you into many scientific fields seeking understanding of what you are looking at.  Many of the things you find confirm your own observations and theory.  As you progress, and realise that you too can make these observations and verify them with repeatable tests, then you don't need NASA's fancy equipment or telescopes to find out many things for yourself.


In this study you will also find that some things that are conventionally accepted and understood contradict your own observations, and this is where I started a list of anomalies.  If there is no known science or theory to describe what I am observing, then how am I to know what it is?  This little puzzle is the subject of another article, in which the anomalies I have encountered will be described.


NASA and other space agencies use this basic information to interpret the composition of stars and galaxies.  Space agencies' equipment is designed to measure and capture frequencies that exist outside of our normal sight and hearing as well.


A study into light and sound spectrum analysis, and what those frequencies represent, is quite intense and for the most part lacks an official list of descriptions.  This information is not freely available and seems exclusive to university courses.


However, many people, through other means, have been able to determine many physical and healing qualities of many frequencies - information that has come from their own scientific experiments and results.  This will be another article.

It is important to know your camera, what its limitations are, and what it is using to look at what it's photographing.  Modern cameras are digital and are programmed with colour recognition, and many other effects.  Any program is only as good as what it was instructed to do, so know a bit about the camera's software capacity.


What I can tell you about my camera is that it is a Nikon PS510.  It has 6 lenses that will step out to 59mm and zoom in at that point.  At the same time I take a photograph, the camera shoots out an infrared beam to the focal point.  This results in the camera being able to see and record objects that are revealed to it by the infrared beam.  The camera then translates this into a colour.  What this means is, my camera can see things in a spectrum I can't.  I can take a picture of what I think I see, but the camera can see so much more.  The resulting images have amazing zoom ability without losing quality, also revealing things that are barely visible when not zoomed in.


I will show you some examples of a crop of a zoomed-in area which, if you were to just look at the picture, you couldn't see.  When you zoom in, you find amazing details that the camera has recorded.  To display the images in full depth requires a large screen, as this will bring out the hidden details that a monitor screen cannot fully show.  When I first displayed my images on a big screen, I was amazed at the depth, the details, and all the activity the camera saw and recorded.

In the original image this is not visible without zooming

after cropping this is more visible

cropped and zoomed in on even further

The above 2 images would, however, be clear and evident on a large screen TV.  And that is really the only way to appreciate everything in the image, as an overall image.  I am currently limited to a monitor screen and zooming in on specific areas.


Some images are good to use auto-correct on, as it brings out the whole image more.  It's crude, but effective, and allows for an easier look at the overall picture without a big screen.  All my photos are originals, and any crops or auto-corrects are done on copies of the original.  Some images seem just black, but when you zoom in or auto-correct, it's another story.  Below is an example of the difference auto-correct can make to what you are seeing.

From the above photo comparison with auto-correct you can see how much 'darkness' can veil the incoming light.  In a way it is the ultimate in poor lighting, and once you remove the poor lighting aspect, you reveal what was seen in clearer detail.  It's an interesting exercise, but I don't use these images to show others, as they are not the original data; there is a margin for error and too much room for the conjecture that the auto-correct just 'made the image up', which it didn't.  It is merely lifting the veil of poor lighting to reveal what I believe to be a more accurate image of what the camera saw at the time.  But due to the margin for error, my study and research only involve the original photos and what is in them, as these can be presented in original and unmodified form to verify the data to anyone who challenges it.
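For readers who want to try the same thing, here is a hedged sketch using Pillow's autocontrast function - an assumption, since the photo software used above is not named - working on a copy so the original file stays untouched (the file names are hypothetical):

```python
# Auto-correct a dark night-sky photo by stretching its contrast, while
# keeping the original file unmodified.  Uses the Pillow imaging library.

from PIL import Image, ImageOps

original = Image.open("night_sky_original.jpg")        # hypothetical file name
corrected = ImageOps.autocontrast(original.convert("RGB"), cutoff=1)
corrected.save("night_sky_autocorrected.jpg")          # saved as a copy
```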


I have had my camera for 4 years now, and in the first 18 months I spent most nights outside, from dusk till dawn, photographing the night sky and what was in it, exploring the questions and observations.  During that time, and after, I took many sequenced photos, one after the other, in an attempt to capture orbital qualities of the stars.  I learnt many things from this, and one of the most important was the orbital patterns, and how important the angle of the star to centre is, compared to which star you are photographing and what other objects are close by.  The camera sees 2 orbital elements, and only at the right angle will it show a detailed analysis.  More on that further below.


When I take a photograph I can see many things while the camera is taking the photo, including the centre point of focus.  This doesn't come up in the image, but only shows as the image is being taken, as to its relative position to centre.  From this I can tell where the star is located in the image, compared to centre.  This is important when photographing stars, as angle to centre is often an issue with obtaining successful images.  I will explain more on this, angles, and moving stars further below.


My camera also has video capacity, and due to the large size of my memory card I am able to record long periods of time and orbital movements.  The video has different capacities to a photograph.  For example, a photograph may say it took a fraction of a second to take, but the image itself takes around 5 seconds to record all the data.  The photo is more detailed, but this does create gaps, even in sequential data.  During those 5 seconds I could see an 'anomaly' flash, but the photo is already being created and the camera wasn't looking.  This way you can miss capturing many things you can see, because your timing with the camera is off.


The video doesn't have that problem, and it captures those flashes or anomalies.  The video can then be converted into still images, at the rate of 29 images per second of video, where those quick flashes can then be identified and observed.  These still images are more limited in zoom capability than photos, but can still offer enough to identify 'anomalies'.
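As a sketch of how that conversion can be done - the free program used is not named above, so OpenCV is an assumption here, and the file names are hypothetical - each saved frame is one 1/29th-second exposure:

```python
# Convert a video into its individual still frames using OpenCV.
# A ~29 fps recording yields roughly 29 stills per second of footage.

import cv2

cap = cv2.VideoCapture("lightning_storm.mp4")   # hypothetical file name
count = 0
while True:
    ok, frame = cap.read()                      # one 1/29th-second frame
    if not ok:
        break
    cv2.imwrite(f"frame_{count:05d}.png", frame)
    count += 1
cap.release()
print(f"extracted {count} still images")
```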


The video is, however, more limited in its field of vision, in that it views a smaller area than what you would see if you took a photo.  I don't know why this is so, it's just the way it is.  It may have something to do with the fact my camera has a wide lens, and perhaps it doesn't use this in video?


Another limitation of the video is that it doesn't shoot off the infrared beam like the camera does with photos, so the video is not recording particles, or photons, in the infrared range like the photos do.


A video will also only record the light it sees at that moment, which is a 1/29th of a second capacity; due to this, very little weaker light is captured and overall the videos are much darker.  A photo taken at the same time, say around twilight, would photograph more light than what you can actually see with your own eyes.  It is considerably lighter.  That's because it represents what it sees over a period of time, and that might only be a few split seconds, but it is still enough time to record much more data, and light.
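A back-of-envelope sketch of the difference, assuming light gathered scales linearly with exposure time (the half-second photo exposure is an assumed example value):

```python
# Comparing the light gathered by one video frame (fixed at 1/29 s)
# against a longer photo exposure.

video_frame_s = 1 / 29
photo_exposure_s = 0.5  # assumed example exposure

ratio = photo_exposure_s / video_frame_s
print(f"the photo gathers ~{ratio:.0f}x the light of one video frame")  # ~14x
```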


looks black, but if you look closely

you will notice small dots of colour

the same image auto-corrected

background radiation is revealed

The white-out effect above may not be a very good image of lightning - in fact it's terrible - but underneath you will notice the lights on in a house, because it's night and it's dark.  Yet looking at everything on the hill, you can see how dry the grass is, the colour of the roofs and buildings, the green in the trees, and a blue sky.  It makes for a rotten photo of lightning, but an accurate photo of how much energy is exploding into an area when lightning flashes.  And if you can see the energy, then it is likewise 'lighting' you up.  Storms are highly energized times.  One thing you can't see or hear in a photo, though, is the thunder, and that is the beauty of videoing lightning storms, because the video has sound as well.  This again is more frequency, more photonic interaction.


Now let's have a look at 1/29th of a second with a video still image.  One thing I discovered from doing this with lightning videos is that quite often there are split images - in one image is the top half, and in the next 1/29th of a second is the bottom half.  It's quite odd, and not something I have seen before.  Probably because most lightning photos that circulate the net are professionally done, with better equipment than I have.  They have time delay, I don't.

Each image of the lightning above was produced by me just pointing the camera in a direction and pushing record on the video.  It was the only way I ended up with anything other than many black shots and the odd white-out.  Each image represents only a fraction of a second, 1/29th of a second.  It is like slowing down time to see what happens, and in a video you can do just that.  You can see the 29 images that make up that second.  I use a reliable and free program to do this, and anyone can do it.


Notice the level of colour.  All this colour is showing massive energetic reactions.  But what is noticeable about lightning today is the direction.  Mostly I remember lightning coming from the sky and down, earthing out.  Today lightning isn't earthing out; it goes sideways, and this in itself is not typical.  The lightning earths out, not because it wants to, but because it is drawn to the energy of the earth.  If it is not being drawn down and earthing out, then what other force is disrupting this natural process - what is 'pulling' or drawing lightning sideways?  A curious discussion for another article; for the time being we have enough to still cover when it comes to photos, lens effects and how to know for yourself what you are looking at in a photo.  At a later date, as I complete these other articles, I will update this page with links, rather than promises.


I have covered some basics about what a photograph is, what your camera can do, and knowing its strengths and weaknesses.  If you have been following, then you will probably see a pattern emerging - and that is that your camera can see things in different ways, depending on circumstances, interference, timing and light shifting.  At times it can see much more than you, and at others it can see less (as in videos at twilight, where I can see more light with my naked eye than what the video records, and they are always darker than what is visible).


Now it's time to explore angle and the centre of focus, and the problems associated with photographing stars and planets - problems that are unique to each one, and related to what it comprises as a system; or, as with our planets, what moons are positioned where, which changes from night to night.  The angle you use one night won't work the next, because the positions of the moons have changed, and all the orbital elements that make up that planetary mass need to be angled just right to capture the finer details.  So let me explain this a bit more.


Before looking at the perspective of the camera, and how it sees things differently at different angles, we need to examine an image and establish the geometric factors associated with the camera lens, which is concave, set in a round casing.  Light refraction and angle are important to understanding your photos.  But I am not going to get technical; I will show you visually what I mean - this I think you will understand better than words.

The above image is an overlay of lines onto a photo, to show the centre of focus and the various sectors that affect angle, especially when you are zoomed right in on something.  The stars in this image are 3 stars in the handle of the Iron Pot.  This is a term used in the southern hemisphere to describe how we see this section of space.  It is upside down compared to the northern hemisphere.  In the northern hemisphere this is associated with Orion's belt.  We call it the Iron Pot because it forms the shape of a pot with a handle, and these 3 stars are the handle.  The handle isn't Orion's belt, but actually what is below his belt.  What the northern hemisphere sees as Orion's belt, the southern hemisphere sees as the base of the Iron Pot.


But as you can see, when you zoom in, and at the right angle, you see there are more than 3.  The middle star is surrounded by a white haze, and what you are looking at are several nebulas.

What you are looking at above is a crop of the Iron Pot handle's middle star from the previous image.  As you can see, it's more than one star, plus the cloudy white haze actually represents 3 nebulas.  Yes, that's right, you can photograph nebulas for yourself, with just a camera.  And to know what you are looking at, in real-time from your location, I use another reliable and free program, Stellarium.  There are also updated star catalogues you can download free and add to the program.  It will show the positions of planets and stars, display constellation information and images, satellite positions, meteor showers - it has many features.

To get an idea of the complex nature of lining up all the particular little stars, I have produced a grid below, with overlaying geometry, to further show angle.  The larger oval is the boundary of the lens, and mostly you can see the star greatly contort from its round shape to almost half a circle.  At this angle it appears you are starting to look at it turned sideways, and much skinnier, although that is only because of the lens.

Then the auto-corrected image, cropped further.

In the auto-corrected version you can see pinkish tinges, and more definition of what is in the nebulas.  Coincidentally, the Great Orion Nebula that you are looking at, in the top part of that cloud, is pink.  Coming down, you reach De Mairan's Nebula, and at the base, to the right, is the Running Man Nebula.  For the most part, we see this as just one star - amazing what a camera can see.  What is also interesting to note is that these weren't even images taken at full focal length and zoom.  To get the 3 stars in the one photo I couldn't zoom in too far.  So imagine the detail zoomed in.



The above grid may look strange to you now, but hopefully by the time I have finished explaining, you will see how it relates to angle, perception, and what you can see.  What I see in the image is basically the locations to try to get the right angle.  When you are zoomed in on a star and take a sequence of shots, if you have lined your angle up right, at certain places as the star orbits through, the camera will snap that greater detail, because at that particular angle it has a really good view.  What is helpful is the centre point showing up as the photo is being taken, and also any nearby stars or planets flashing their position.  If both these objects aren't angled correctly, you won't get the right view.
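A hedged sketch of how such a centre-and-sector overlay can be drawn - the overlay tool used above is not named, so Pillow's ImageDraw is an assumption, and the file names are hypothetical:

```python
# Draw centre lines and concentric ovals over a star photo to mark the
# centre of focus, the lens boundary, and inner sectors.

from PIL import Image, ImageDraw

img = Image.open("star_photo.jpg").convert("RGB")  # hypothetical file name
draw = ImageDraw.Draw(img)
w, h = img.size
cx, cy = w // 2, h // 2

draw.line([(cx, 0), (cx, h)], fill="red", width=2)   # vertical centre line
draw.line([(0, cy), (w, cy)], fill="red", width=2)   # horizontal centre line
for frac in (0.9, 0.6, 0.3):                         # lens boundary + sectors
    draw.ellipse([cx - w * frac / 2, cy - h * frac / 2,
                  cx + w * frac / 2, cy + h * frac / 2],
                 outline="yellow", width=2)

img.save("star_photo_grid.jpg")
```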


So what view do you get when the angle isn't right?  I have come to call this other view an orbital mass, because that's what it represents.  At certain angles the camera can see all the different components around the star, but at others it can only see it as one star.  The best example I can give you is the Pleiades.  It seems small and faint, but the small cluster of stars is quite vast.  It is called the 7 Sisters, because that's how many stars we can see.  Like the handle of the Iron Pot, to get all the stars in I can't zoom in all the way, so again these images are not at full zoom and much more can be seen when zoomed in.

In this crop directly above, we are missing at least one of the sisters, because I cropped it off to bring out this section.  But even in the one above this you can see there are many more stars than just 7.  What makes it interesting is taking the same images above and auto-correcting them, to see what they show.  Shown in the same order as the ones above, the 3 below have been auto-corrected, bringing out some interesting things.

Our 7 Sisters have turned into 14.  It isn't so easy to see the fainter ones, so I auto-corrected again, to highlight these more - it degrades the quality of other things, but brings all the elements at play into better vision.

In the above image you should be able to count the 14 circles.  But I have the added advantage of knowing there are more that can't be seen, because when I zoom I can see them.  So in order to bring out yet more to show you, I auto-corrected again.

As you can see, the above image is now becoming quite scrappy, but revealed in its depths are more circular orbital masses - if you are struggling to see them, I have circled them below, and even then it might be hard to see them.  There are possibly more, but I am unsure of those so I didn't circle them.

So how many sisters do we have now (with one I should have marked not included) - how many orbital masses?  20, possibly 21 or 22.  The family of the 7 Sisters has certainly grown.  So let's look at the right-angled one that we saw above, where the 7 Sisters became a system, not just stars.  What can auto-correcting this reveal?  Remember these were taken seconds after each other and are in the same relative position, just slightly moved into a better angle.  I auto-corrected the below image 3 times, and with these ones all it does is make the stars brighter.  Again it's to do with angle and how there is less interference.

There are too many to accurately count, but the main ones we can see are suns, or extremely large planets, or nebulas.  As you can see, the image has not become 'scrappy' like the last auto-correct.  The 7 Sisters is far from an accurate description of the Pleiades.  You can now see more clearly the orbital masses I circled, and you should be able to see the ones I couldn't determine from the other photo - with 2 small stars close together, almost in the centre of a square formed by 4 bright stars.


One of those white dots is Alcyone, the central sun of the Pleiades - it's the biggest one there.  Around it are other suns, of other systems, all part of the Pleiades constellation.  And from what I understand, we are also part of that constellation, and as such our solar system revolves around the central sun of the Pleiades, Alcyone - or, more correctly, our solar system is gravitationally attracted to Alcyone.


In the next image I cropped the image above, which chopped off a sister or two, and auto-corrected 3 times.

And guess what - I didn't need NASA for any of it.  I didn't have to rely on their disclosure, nor question what they do say, I can see so many things for myself, as you can too - as anybody can.  Are there 7 sisters - if seeing is believing then you have to say NO.  We can only see a small portion of what is really there.

At times the camera only sees one object; even though it sees the elements around it, it lumps it all together in one orbital mass.  Hence what you think is one object really represents a system within orbit of the main star it can see (which, as said above, is most likely a sun and its own solar system).  Whilst the camera lumps the image into one, it does recognize something is around it, and this is shown as rings in the orbital mass.


It was a theory of mine that the number of rings indicated the number of objects, but so far the results have been inconclusive - with a 50/50 rate of determining the objects correctly.  I can check this against the moons of the planets I can photograph and the number of ring elements I can see in the orbital mass.  It is possible that 2 nearby objects are represented in one ring, but this is only guesswork and has no substantiated, repeatable data to back it up.


My conclusion over the inconclusive orbital mass rings is that it is yet again an issue of angle and being able to get that right angle.  In a series of photographs I start with the star in the corner so I can photograph it as it progresses.  Most of the images will be of orbital masses, and it can sometimes be very difficult to get that one right angle that shows what's really there.

Wikipedia has a far more technical explanation 

A photon is much more than light, and would be best described as 'information'.  There is light and sound we see and hear, then there is a huge range outside of that.  Photons cover every range of existence.  Even you are composed of photons; photons are in your DNA - photons are in all things.


The photons that I am discussing exist only in a small range of 'what is'.  These photons are represented by colours, and in scientific terms this usually relates to the coloured energy they can be observed having.


As all things cycle, our solar system is moving past a photon band, decades wide.  This is causing extra photons to enter the Earth as well, and even to within your own DNA.  On a cosmic scale, our solar system is being flooded with photons from the photon belt, and this is bringing more light - both literally and spiritually.


As I said, photons are 'information', and this extra light is having a real and physical effect on all life in our solar system.  Can you tell from a graph of photons, or a photograph, what is going on in space?  Yes and no - yes, because you can see and discover so many things for yourself, and no, because we are all limited by the equipment we use.


Every photon reaction, and the subsequent building of matter into a myriad of things, involves quantum particles of certain wavelengths and frequencies; photons are the real building blocks of matter, not atoms.  Photons are what create the atoms and their sub-atomic parts of protons, electrons and neutrons.  Photons in a visual frequency range, seen as light, are the closest to the God particle we will be able to see with our human eyes.



When you are looking at a photo you are looking at more than just an image of what was photographed; what you are actually looking at is the camera's representation, as a graph, of the photons it sees - or the light, as we commonly understand it.


What a photograph is, is contained within its name, and it is very simple - it's a graph of photons (light).  If a photograph is a graph of photons, then what is a photon?

Photons

What does colour mean in a photograph ?

What does colour mean in a photograph, and how does it tell us what is really going on?  Perhaps the best example I can think of at the moment is fire.  We all know fire and what colour it is: orange, red, yellow, all those reddish/orange colours.


We also know that fire is hot and can burn.  We see red as a warning sign, and believe it appropriately warns of danger and hot things.  We associate heat and fire with red and orange, and our danger signs are colour coded because we perceive red as the level of threat we need to assess for safety purposes.


But we also know that a blue flame is much hotter, and it takes special components for it to achieve being so incredibly hot.  This is excessively hot - even lava is still red/orange, and you would melt if you fell into that.



Photonic wavelength activity creates atomic structure

Why don't we associate blue with hot, if it is far hotter than a red flame?  Why is red the colour of caution, when blue is the more threatening?  One of the many contradictions we are taught.


The light coming off the blue flame a welder uses can damage your eyes.  Why is that?  


If you look at the colour chart you will see there is a very clear reason why.  Red has less energy and longer wavelengths; blue has more energy and wavelengths that are closer together.  Blue is a higher frequency with a shorter wavelength, whereas red is a lower frequency with a longer wavelength.
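One standard way to put numbers on 'blue is hotter' is Wien's displacement law, λ_peak = b / T; a small sketch, with the peak wavelengths chosen as assumed examples:

```python
# Wien's displacement law: a body whose light peaks at a shorter (bluer)
# wavelength is hotter than one peaking at a longer (redder) wavelength.

WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def temperature_from_peak_nm(peak_nm: float) -> float:
    """Black-body temperature (K) for a given peak emission wavelength."""
    return WIEN_B / (peak_nm * 1e-9)

print(f"peak 700 nm (red):  ~{temperature_from_peak_nm(700):.0f} K")  # ~4140 K
print(f"peak 450 nm (blue): ~{temperature_from_peak_nm(450):.0f} K")  # ~6440 K
```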

What photographers call over-exposure is in reality lots of light - photons.  We refer to light as being white, yet light isn't any colour from our perspective; it only puts light on things so we can see them - and then we can see the colour, because the light reveals it and it's not dark.


Most people are taking photos to have pretty pictures.  I take them to study the photons.  My camera is a research tool, not a novelty.  Within a short amount of time I was able to determine 3 basic types of stars, based on their colours and the energy surrounding the 'white' light we see as the star.  Those 3 are blue, orange, and the bright and colourful 'multi-coloured'.


The blue ones, like Spica, Rigel and Sirius, have very distinct blue energy auras around the 'white' of their core.  They are very easy to photograph and to capture on video.  The orange ones, like Arcturus, Betelgeuse and Mars, are very difficult to capture on video, and problematic to photograph.  Orange is a lower frequency and more difficult to pick up.


At full zoom on my camera I have a close-up view of the star, and one slight movement puts it out of the frame.  Precise fine tuning is required to keep moving the star back into the frame, as it is constantly moving.  At certain angles the camera won't see it properly and will represent the star as what I have come to call its 'orbital mass' (because that's what it represents - more on that further on).  This orbital mass looks much like a colourful spinning disk, or orb.  Depending on which star it is, it can be blue and easy to see, or orange and very difficult.


The 'multi-coloured' ones have very bright light but couldn't be described as blue.  When you look at the orbital mass through the camera lens there are all these bright and different colours, like a silvery rainbow.  Very pretty.  Vega, Capella and Canopus are ones like this.


The very same colour you get in a photograph of a star matches the known information about that star.  I use a very good free program, Stellarium, which includes star data such as spectral analysis - and when it comes to stars and what they are, it's all about spectral analysis.  As science progressed, ways were found to measure other frequencies, such as gamma rays and other 'non-visible' photons.  But at a very rudimentary level you can make your own discoveries about the stars and what's around them, just by the colour, what might be interfering with it, or perhaps even objects outlined by the star's light.  I will discuss these things further on.



How can photon frequencies be represented ?

Here is the very basic representation of how, as yellow light increases in energy, it blueshifts, and as it decreases in energy, it redshifts.



When yellow light blueshifts or redshifts


Lightning

However, the video's capacity to capture 1/29th of a second is the only way I have successfully captured lightning.  A photo takes in too much light at once, destroying the shape of the lightning bolt and creating a massive 'white-out'.  I suppose I could adjust the exposure, but then it is also a matter of timing.  I learnt much about reading the rumbling of thunder to tell when the flash was coming, and I might push the button just as it happens, but I am that split second off and miss it completely, or rarely I would get a white-out.  So some things are best examined by video, especially lightning.  My camera doesn't have the capacity to do time exposure either, so setting it to wait and then record these flashes isn't within its capacity.  You have to work with the tools you have, and the video works fine enough for comparative purposes.


Angle and Perspective


The above image isn't always easy to get, and it's because of the angle; the smaller stars need to be in certain positions relative to the angle of the camera lens to be able to see all of its elements - otherwise you get what I call an orbital mass, but I will discuss that shortly.  Before I get back on the topic of angles and perspective, I would like to show you what the above image looks like if you remove the veil of darkness more - introducing more texture and colour.

The area of the nebulas is exhibiting colour.

So I think we can all count and see there are 7 stars.  But what you think are 7 stars is in reality more.  At the right angle, the camera can see what makes up these 7 lights we see and call stars - and it likewise corresponds with Stellarium star charts and the detailed stars of the constellation.  So how many Sisters are there?  If I crop this next image you won't see all the stars, so I'll show you the overall image, then a crop of the main area.


What the Pleiades really looks like


How many Sisters are in the Pleiades ? 7 did you say ?

Orbital Masses - what do they represent ?

Temporary stop:  last added to 17 December 2016

Subjects still to cover include: orbital masses; angled orbits and perspectives of looking south or north, and things rising or setting - following a sequence of shots; northern and southern hemisphere perspectives; where lens effects from the sun will end up in your photo, and how many, depending on the zoom; light and interference, like fog and trees; light reflection of the moon off still water (which can produce an image indistinguishable from the real one); planets and their moons and how, from night to night, they affect the angle you need to get the right shots; lens effects from the Sun, Moon, Venus and artificial light sources; and more.


Stay tuned - more is coming!