# Thread: Creating an Analog Model of a Galaxy

1. ## Creating an Analog Model of a Galaxy

I have been a programmer for a number of years, and only recently delved into physics and astronomy. I was surprised to see this April 3 article from space.com, which discusses some results from a supercomputer simulation where "scientists used powerful software to model the formation of stand-alone disk galaxies and follow up to 100 million hypothetical stellar particles being tugged at by gravity and other astrophysical forces."

Why the surprise? In my spare time over the previous week (just prior to the article release) I had written a PC-based simulation of my own to analyze galactic structure and formation, and this article sounded like a very similar model, though much further developed, and probably far better-funded. Granted, I have only scaled my model up to 8 million particle points, but the operation is entirely scalable, limited only by the memory of my PC and limits of the programming language (VBA), plus a degree of patience.

My initial model used only mass and location to analyze gravitational intensity/acceleration, calculating the effect of every particle in a sphere of uniform density on various locations inside and outside the sphere. I set up the analog sphere as a 3-D array representing the three spatial dimensions; the content of each element is the 'mass' of the particle at that location.
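As a sketch of how such an array might be populated (I work in VBA, but Python with NumPy makes for a compact illustration; the unit mass per interior cell is an assumption I've added):

```python
import numpy as np

# Illustrative sketch, not the author's VBA: a 3-D grid whose elements
# hold the 'mass' at each cell, with unit mass assigned to every cell
# inside a sphere of diameter 40.
D = 40                                  # sphere diameter in cells
R = D / 2.0
c = R - 0.5                             # center of the cell grid

i, j, k = np.indices((D, D, D))
inside = (i - c)**2 + (j - c)**2 + (k - c)**2 <= R**2
grid = np.where(inside, 1.0, 0.0)       # 'mass' of the particle at each cell

print(int(grid.sum()), "cells carry mass out of", D**3)
```

The grid holds 40 × 40 × 40 = 64,000 cells, of which only those inside the sphere carry mass.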

The reason I used a sphere was so I could validate my math and logic. Gravitational effects for a sphere conform to some fairly simple formulas. Testing amounted to running my particle-by-particle calculations, adding the results, and comparing the total to the simple formula. After a couple of corrections, the results of my calculations matched the formulas.
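For anyone curious what that validation can look like in code, here is a hedged sketch (Python stand-in for the VBA; G = 1 and unit cell masses are my assumptions): sum each cell's pull at an exterior point and compare with the shell-theorem result, which treats the whole sphere as a point mass at its center.

```python
import numpy as np

# Validation sketch (illustrative, not the author's implementation):
# compare the particle-by-particle sum against the shell theorem, which
# says a uniform sphere attracts an exterior point like a point mass.
D = 20                                   # small diameter keeps the sum fast
R = D / 2.0
c = R - 0.5
x, y, z = np.indices((D, D, D)).astype(float)
mass = ((x - c)**2 + (y - c)**2 + (z - c)**2 <= R**2).astype(float)

# Exterior test point on the x-axis, a distance 3R from the center (G = 1).
px, py, pz = c + 3 * R, c, c
dx, dy, dz = px - x, py - y, pz - z
d2 = dx*dx + dy*dy + dz*dz
d = np.sqrt(d2)
ax = np.sum(mass * dx / (d2 * d))        # x-component of the summed pull

M = mass.sum()
expected = M / (3 * R)**2                # point-mass formula at distance 3R
print(ax, expected)                      # the two should agree closely
```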

The second phase of the model squashed the sphere into a flat disk shape, so that density was highest at the center, and dropped off to zero at the edges. I could then calculate the net acceleration due to gravity at any location, but I chose to work just in the plane of the disk. I am still puzzling over the results. They are quite different from the sphere model.

I tested the model using a sphere diameter of 40, requiring a matrix of 64,000 points (40 × 40 × 40). I later expanded the model to a diameter of 200, or 8,000,000 points, and a sampling of 500 distances, making for over 4 billion calculations. It's amazing what a PC can accomplish while I sleep.

The newest version is an attempt to add directional vectors for each particle, so the model can generate meaningful time-lapse type pictures of motion, or simply show the results of what happens after a longer period of time.

I'll go back to the spherical model as a starting point. Instead of simply calculating the acceleration at various locations, I want to calculate the effect of every particle on every other particle. A million particles means a million times a million calculations for each time period. Maybe my matrix is too large for a PC to handle in a reasonable time frame. The 64,000-particle model will do for starters; roughly 4 billion (64,000 × 64,000) calculations per period is somewhat more manageable.
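A direct every-particle-on-every-particle update might look like the following sketch (Python rather than VBA; the time step, softening length, and unit masses are all assumptions of mine to keep the example self-contained):

```python
import numpy as np

# Direct-summation N-body step: O(n^2) pairwise accelerations, then a
# simple Euler update. G = 1 and unit masses are assumed; 'soft' is a
# softening length that prevents division by zero at close encounters.
def step(pos, vel, dt=0.01, soft=0.1):
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        diff = pos - pos[i]                    # vectors to every other particle
        d2 = (diff**2).sum(axis=1) + soft**2
        inv_d3 = d2**-1.5
        inv_d3[i] = 0.0                        # a particle exerts no self-force
        acc[i] = (diff * inv_d3[:, None]).sum(axis=0)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.standard_normal((100, 3))            # 100 particles, 3-D positions
vel = np.zeros((100, 3))
pos, vel = step(pos, vel)
```

Because the pairwise forces are equal and opposite, total momentum is conserved to floating-point precision, which makes a handy sanity check on a long run.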

I've considered whether I should change the 3-D array to a 4-D array with time as the fourth dimension, but I really only need to know how the current matrix maps onto the next. Once the next generation is built, I no longer have any use for the previous matrix. Funny how much like time that really works.
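That two-matrix scheme can be sketched as follows (the spreading rule is a toy I invented purely for illustration; the point is only that the previous buffer is discarded once the next one exists):

```python
import numpy as np

# Double-buffering in place of a 4-D (time) array: only 'current' and
# 'nxt' ever exist at once. The update rule below is a toy stand-in for
# the real physics, chosen only to demonstrate the buffer swap.
D = 8
current = np.zeros((D, D, D))
current[D // 2, D // 2, D // 2] = 1.0          # one unit of mass to track

for t in range(3):
    nxt = 0.5 * current + 0.5 * np.roll(current, 1, axis=0)
    current = nxt                              # the old matrix is now garbage
print(current.sum())                           # total mass is conserved
```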

I'll also want to add collisions in the mix, which could have varying levels of stickiness - depending on velocities and other factors. I'm not certain if charge would be significant as well, but I'll keep that in mind.

The characteristics of each particle are limited only by creativity, but each tested location already requires 8 million calculations, and every new variable adds that many again. Four variables, and the results take four times as long to process.

I'm still working on the display feature; right now, I plan to use color and intensity to show visually what is where. Since it's a 3-D data model on a 2-D display, it tends to look washed out, because each pixel represents many particles. Any helpful suggestions would be appreciated - maybe show depth via a line of pixels, or a hexagon. Also, since I'm thinking about it now, I need to add a save-and-restart capability, as I may want to run the model for weeks at a time. Ever have that disappointment of a glitch after hours of processing, knowing you have to start all over from the beginning?
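On the washed-out display, one common remedy (offered as a suggestion, not a claim about what will work best here) is to sum each column of the 3-D grid onto the 2-D screen and then log-compress the sums before mapping them to pixel intensity, so faint regions stay visible next to a dense core:

```python
import numpy as np

# Suggestion sketch: collapse the 3-D density along the depth axis, then
# log-compress so a few dense cells don't wash out everything else.
# The sparse random field stands in for real particle data.
D = 32
rng = np.random.default_rng(1)
density = rng.random((D, D, D)) * (rng.random((D, D, D)) > 0.9)

column = density.sum(axis=2)             # depth-summed 2-D projection
image = np.log1p(column)                 # log(1 + x) keeps empty cells at zero
image = image / image.max()              # normalize to [0, 1] for display
print(image.shape, float(image.min()), float(image.max()))
```

A checkpoint every N steps (for instance, `np.save` of the current matrix plus the step count) would cover the save-and-restart concern as well.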

Other future enhancements I'd like to try involve an inflow of particles, perhaps somewhat off-center to the galactic sphere. Or how about two passing streams of particles on non-parallel trajectories and no initial galactic sphere? Could that generate a spinning galaxy at the vortex? Time - and patience - will tell.

2. Has there been any update to your program since you posted this? I'm a big fan of programming simulations and hopefully will delve into a gravity simulator at some point.

3. Same here--analog fan too

4. Originally Posted by mkline55: "I have been a programmer for a number of years, and only recently delved into physics and astronomy. I was surprised to see this April 3 article from space.com ..."
Does your program plot the photons that would be received by an observer from each rotating source or does it just plot the location of the source as calculated in your/their simulation over time?

The following image shows what the light paths emitted from 2 rotating sources would look like at t0 after one complete cycle in 3-D Euclidean space. If you discretise the photon quanta and scale the time increments accordingly, you could create a real-time simulation of what someone observing rotating sources would see on the scales shown (though you might need a bit more storage and processing power), while also keeping c constant.

5. Once you discretise the (emitted) photon quanta and accelerate the frame rate (time), you should note that the observation period in the simulated model must increase in proportion to this acceleration so that c remains consistent for the emitted photons being observed. The next image shows how the photon path quanta can be generated over 4 quarters of rotation (basically 4 quanta, each with duration Pi * R * (c/v)/2). The undistorted distance traveled by a photon emitted from point 1,0 that travels directly to the observer at point 1,4 over one complete rotation equals 2 * Pi * R * (c/v), where v is the rotational velocity of the source around its center of mass and R is the radius of rotation of the source, measured in light years.

If you take a photo of someone around 6 feet away who is waving a sparkler in a circle 2 feet in diameter, and your exposure time equals the time it takes for one rotation, you will see a circle in your photo. If you perform this same experiment on galactic scales (as per the article), what would you expect to see at the observer if the observation period equaled the rotation period? How much further would the first emitted photons travel over one complete rotation if the source was moving at 0.5c, 0.1c, 0.01c, etc.? They would travel c/v (i.e. 2, 10, 100) times the distance (2 * Pi * R) they would have covered if the tangential velocity were c for one complete rotation. Last edited by LaurieAG; 2014-Jun-10 at 11:56 AM.
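The path-length ratios quoted above can be checked with a few lines of arithmetic (Python; the value of R is arbitrary, since only the c/v factor matters):

```python
import math

# Over one full rotation, a photon emitted at the start travels
# 2 * pi * R * (c/v): the circumference scaled by c/v. For v = 0.5c,
# 0.1c, 0.01c that factor is 2, 10, 100 respectively.
R = 1.0                                  # radius in light years (arbitrary)
circumference = 2 * math.pi * R

ratios = {}
for v_over_c in (0.5, 0.1, 0.01):
    ratios[v_over_c] = 1.0 / v_over_c    # c/v
    print(v_over_c, ratios[v_over_c], circumference * ratios[v_over_c])
```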

6. mkline, does your simulator make use of Parallelogram Law summations?
