Thursday, December 13, 2018

GIS 5990 - Special Topics in GIS Archaeology - Module 10



The focus of this week was to create shareable data for all the layouts generated in the previous week. Above is an example of all layers exported in KMZ format, as seen in Google Earth Pro. All data was exported and placed online. (A sketch of how the export could be scripted follows the layer list below.)

The layers included are:

The park boundary
The Maritime Heritage Trail
The Reclassified Bathymetry Layer
The Reclassified Benthic Layer
The Clipped DEM layer
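For reference, the export step could also be scripted. The sketch below loops over the layers of an MXD with arcpy's Layer To KML tool; the MXD path, output folder, and layer handling are assumptions for illustration, not the exact files used here.

    import os
    import arcpy

    # Hypothetical paths; substitute the real MXD and an output folder.
    mxd_path = r"C:\GIS5990\Module10\Biscayne.mxd"
    out_folder = r"C:\GIS5990\Module10\KMZ"

    mxd = arcpy.mapping.MapDocument(mxd_path)
    df = arcpy.mapping.ListDataFrames(mxd)[0]

    # Export every (non-group) layer in the data frame to its own KMZ file.
    for lyr in arcpy.mapping.ListLayers(mxd, "", df):
        if lyr.isGroupLayer:
            continue
        out_kmz = os.path.join(out_folder, lyr.name.replace(" ", "_") + ".kmz")
        arcpy.LayerToKML_conversion(lyr, out_kmz)
        print("Exported {0}".format(out_kmz))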


GIS 5990 - Special Topics in GIS Archaeology - Module 9


The series of maps seen above shows the surrounding landscape classifications for all five known sites of the maritime heritage trail. This information was used to determine crucial geographical factors that may be directly linked to shipwrecks.


The two charts presented above show reclassified benthic and bathymetric layers of Biscayne Bay. This information was used to determine the likelihood of shipwreck presence, based on information about known shipwrecks.


The final layout represents a weighted overlay generated by combining the bathymetric reclassification with the benthic reclassification. The only areas shown above are the overlap regions, with influence from both rasters.
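The same reclassify-and-combine chain can be expressed in arcpy. The sketch below is a simplified stand-in for the Weighted Overlay tool, using Reclassify followed by an equal-influence map algebra sum; the raster names, break values, and class codes are made up for illustration.

    import arcpy
    from arcpy.sa import Reclassify, RemapRange, RemapValue

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS5990\Module9"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Rank bathymetry depth ranges by suitability (break values are illustrative).
    bathy_reclass = Reclassify("bathymetry.tif", "VALUE",
                               RemapRange([[-40, -10, 1], [-10, -5, 2], [-5, 0, 3]]))

    # Rank the benthic habitat classes by suitability (codes are illustrative).
    benthic_reclass = Reclassify("benthic.tif", "VALUE",
                                 RemapValue([[1, 3], [2, 2], [3, 1]]))

    # Equal-influence combination; cells missing from either raster drop out,
    # which mimics keeping only the overlap of the two reclassified layers.
    overlay = 0.5 * bathy_reclass + 0.5 * benthic_reclass
    overlay.save("shipwreck_potential.tif")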

GIS 5990 - Special Topics in GIS Archaeology - Module 8



The above layouts show the ENC chart data, the 1863 Nautical Chart and the Bathymetric raster clipped to the boundary of Biscayne Bay in Florida. The information will be used over the next two weeks to perform a marine archaeology study to predict potential shipwreck locations within the boundaries of the park.

GIS 5990 - Special Topics in GIS Archaeology - Module 7


Over the past several weeks, I have worked on a project focused on detecting Scythian burial mounds in the general area of Tuekta, Russia. The ultimate goal of the study was to generate a point dataset containing a random distribution of points, each expressing the possibility that a Scythian burial mound is present, based on previously known information.

The metrics considered for the possible presence of a burial mound in the immediate area of each point were slope, aspect, and elevation. All of these layers were generated in ArcMap from DEMs downloaded from the web. The study itself was confined to the greater Tuekta area, and the random points of the predictive model were merged with the previously marked locations of known burial mounds, so that the spatial data of the known locations would influence the prediction at similarly situated points.

The results of the study are represented by an Ordinary Least Squares (OLS) model, which was created after the point files were merged. With zero as the average value, the coloring of the points tells us not only how compatible (or incompatible) a given point's location is with a possible burial mound, but also the degree of confidence in that result based on the data provided. The spatial autocorrelation test run on the data returned a z-score of 14.36 and a p-value of 0.0, indicating that the pattern is strongly clustered and very unlikely to be the product of random chance.
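For reference, the spatial autocorrelation (Global Moran's I) test can also be run from a script rather than the toolbox dialog. The sketch below assumes a hypothetical merged point shapefile and value field; it is not the exact call used for this study.

    import arcpy

    arcpy.env.workspace = r"C:\GIS5990\Module7"  # hypothetical workspace

    # Global Moran's I on the merged point file; "MoundProb" is a made-up
    # field holding the modeled likelihood value at each point.
    arcpy.SpatialAutocorrelation_stats(
        "merged_points.shp", "MoundProb", "NO_REPORT",
        "INVERSE_DISTANCE", "EUCLIDEAN_DISTANCE", "NONE")

    # The z-score and p-value are reported in the geoprocessing messages.
    print(arcpy.GetMessages())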

There are factors which were not considered for this study but may be important to the model's overall accuracy. These may include proximity to water, the shape of the surrounding terrain, proximity to other sites, and perhaps even the amount of open space around each point. (The known burial mound locations all occur close to one another, in one very large cluster.) The model does, however, provide potential locations based on the exact coordinates of each point, and each positive hit could benefit from a field survey, all other factors being neutral.

GIS 5990 - Special Topics in GIS Archaeology - Module 6

The series of maps provided represents a set of data gathered in preparation for a final study on Scythian mound distribution. Four of the five layers shown above were generated directly from a digital elevation model (DEM) raster downloaded from Earth Explorer: the slope, the aspect, the elevation, and the contours.

The last map shows a point shapefile created based on known mound locations, visible on basic aerial imagery.
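Deriving the slope, aspect, and contour layers from the DEM is a short chain of Spatial Analyst calls. A minimal sketch is below; the DEM filename, the output names, and the 20 m contour interval are placeholders rather than the actual files used.

    import arcpy
    from arcpy.sa import Slope, Aspect, Contour

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS5990\Module6"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    dem = "tuekta_dem.tif"  # hypothetical DEM from Earth Explorer

    # Slope (in degrees) and aspect derived from the DEM.
    Slope(dem, "DEGREE").save("slope.tif")
    Aspect(dem).save("aspect.tif")

    # Contour lines at a 20 m interval (interval chosen only for illustration).
    Contour(dem, "contours.shp", 20)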

GIS 5990 - Special Topics in GIS Archaeology - Module 4


The map above shows the results of a predictive model for the general area of the Tangle Lakes Archaeological District in Alaska. The areas in red represent the highest potential for finding archaeological sites, based primarily on slope, aspect, elevation, proximity to water, and the presence of pre-Pleistocene ice caps.

The yellow and green areas represent moderate and low probability respectively.

Sunday, December 9, 2018

GIS 4944 - GIS Internship

To wrap up my Internship, I created a personal website portfolio which details some of my GIS related accomplishments to date. It contains my contact information, as well as my updated professional resume.




URL: https://amadeuszzajac1987.wixsite.com/gisportfolio


Excerpt from the Home page which highlights my career goals:

''It seems to me the nature of our existence is inseparable from physical phenomenon that we regularly encounter. In a way, everything that is of interest to human beings can be boiled down to physical things that surround us. Whether they be natural resources that we need for survival, or simply physical phenomenon of cultural or spiritual significance. Either way, an easy access to this knowledge is essential to fully understand the implications of our surroundings, as they are essential for our continual thriving as a civilization. My career goal is to contribute to revolutionizing the use of spatial data in all areas which aim to enhance human progress and understanding. Large and easily accessible databases would allow us, as a civilization to obtain spatial information faster and cheaper in all areas of work. Efficient use of geospatial data, and the creation of publicly accessible databases may ultimately cut down the need for time consuming, costly and sometimes dangerous process of physical exploration. GIS is just one of many, but nevertheless a great tool to use when striving for this goal.''

Thursday, October 18, 2018

GIS 5990 - Special Topics in GIS Archaeology - Module 5


This week's lab assignment focused on gathering data for an upcoming analysis of Scythian burial mound locations. The data included a DEM raster obtained from the Earth Explorer website (represented above as the large, colorful area), an aerial image of the nearby region for added detail, and a shapefile containing the location of the Tuekta site.

Saturday, October 13, 2018

GIS 5990 - Special Topics in GIS Archaeology - Module 3


This week's lab focused primarily on creating interactive data, in KMZ format, based on what we had worked on over the past several modules. KML files can be opened in Google Earth, from which they can be exported into a Google Maps format.

Any user with viewing access can display the data on the go from any device with internet access (via Google Maps), and anyone with editing access can also change the data on the go.

Sunday, September 30, 2018

GIS 5990 - Special Topics in GIS Archaeology - Module 2

This week's module was a continuation of the first assignment.


Three rasters illustrating the location of the La Danta pyramid in El Mirador-Rio Azul National Park.



- The first shows the NDVI raster, which measures the amount of biomass within the general area. The darkest green implies the thickest jungle, while red implies a lack of vegetation.

- The second shows the Lyr_451 raster. This band combination better illustrates recent stages of plant growth, or stress. The data may be thrown off by recent rainfall.

- This is the final raster, which represents a supervised classification of land into 9 categories. The categories were created based on known data.
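For context, NDVI is calculated from the red and near-infrared bands. The snippet below is a generic illustration using Spatial Analyst map algebra; the band file paths are placeholders standing in for Landsat 7 bands 3 (red) and 4 (near-infrared).

    import arcpy
    from arcpy.sa import Raster, Float

    arcpy.CheckOutExtension("Spatial")

    red = Raster(r"C:\GIS5990\Module2\band3.tif")  # hypothetical red band
    nir = Raster(r"C:\GIS5990\Module2\band4.tif")  # hypothetical near-infrared band

    # NDVI = (NIR - Red) / (NIR + Red); Float() keeps the division from truncating.
    ndvi = (Float(nir) - Float(red)) / (Float(nir) + Float(red))
    ndvi.save(r"C:\GIS5990\Module2\ndvi.tif")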

Thursday, September 13, 2018

GIS 5990 - Special Topics in GIS Archaeology - Module 1



1.     I began the module by reading the introduction provided in the map assignment.
2.     I clicked on the Earth Explorer website link, where I saw that my account from the previous semester was already logged in.
3.     In the search criteria window, I changed my criteria to decimal degrees, and clicked on the blue Add Coordinates box where I copied the degrees provided. I saw that a marker appeared on the map, somewhere within Central America.
4.     I clicked the “Data Sets” button.
5.     In the data sets window, I navigated to “Landsat” and saw a fairly extensive set of sub-menus. I attempted to locate the mentioned ‘Landsat 7 ETM + SLC-on (1999-2003)’ option, but it was not there. Instead, I selected Landsat 7 ETM + C1 both Level 1 and 2. It was the closest name to the set mentioned in the lab.
6.     I viewed both sets of the data, and was not able to locate the specific aerial image for the date mentioned in the assignment. Because all of the Landsat 7 level 2 data required payment, I downloaded the most recent image from September 5th of 2018. I thought this would be the most appropriate, as the assignment mentions that the most recent data ought to be used in most cases.
7.     I downloaded the highest quality GeoTIFF file, and saved it to my raster folder. I named the file Landsat1Level1.
8.     The file downloaded in a .tar.gz format, so I had to unzip it twice before I was able to see all the .tiff files contained within. (A short Python sketch of this extraction appears after this list.)
9.     I opened a new MXD file, and named it AZajacMaya_PyramidsWk1.mxd as the directions required. I saved the MXD to my MXD folder within Module 1.
10. I added the Imagery basemap to my blank MXD file, from the drop down Add Data menu.
11. I expanded ArcCatalog and connected to my Module 1 folder. There, I right-clicked on my Shape sub-folder and selected the option to create a new personal geodatabase. I named the geodatabase Maya_Pyramids according to the directions.
12. In the ArcMap main menu, I clicked on Windows and selected Image Analysis. A new window appeared, which I docked to the right side of my screen.
13. In the main menu, I clicked on the Customize Menu and selected Toolbars, then Image Classification. I placed the toolbar in a free space at the top.
14. I clicked the Add Data button and navigated to the raster images that I downloaded beforehand. Because my Landsat 7 image was different, I obviously could not open the very same image that the instructions suggest. I did, however, select the 8th band and open it. When asked if I wanted to build pyramids, I clicked yes.
15. I took a bit of time to examine the Landsat image and to compare it to the Imagery basemap.
16. I viewed it at 1:30,000 and 1:8,000 to see the contrast. It was difficult, however, to see detail in the black and white Landsat image.
17. I went into the layer properties Symbology tab, and set “Display Background Value 0” to no color. I also noticed that my Landsat band is visibly damaged and contains missing patches throughout.
18. I added the Mirador point shapefile and zoomed to it. I made it appear as a large yellow dot, and I took time to inspect what lies underneath. The mound was not very easy to see on the Landsat image, but was very visible over the Imagery basemap.
19. After inspecting the image, I followed the directions provided in the lab to pull up the Composite Bands tool in the Data Management toolbox. For the input files, I navigated to my Raster folder and selected the first three bands, in the appropriate order. I ran the tool, and the newly created raster was added to my MXD. (The output had been set to save to my geodatabase beforehand.) A scripted sketch of this step appears after this list.
20. I entered the layer properties and went into the Display tab. There, I changed the band numbers next to the designated colors (Red: 3, Green: 2, and Blue: 1), as they were indeed backwards. I also set the value of zero to display as no value at all. I pressed Apply, then OK to display the image with the new settings.
21. I tried to pan-sharpen the image, but was not able to do it. I made sure that my settings looked exactly as they did in the example provided in the lab, and I was still unable to check the box. The problem was not that the option was unavailable; it seemed to be available (the text next to the box appeared in black, as if it were clickable), but it simply could not be clicked. I have no idea what went wrong here, but I was not able to pan-sharpen the image.

22. I zoomed my map to the Mirador point file.
23. I right clicked on my Nat_Clr layer in the Image Analysis window, and selected the Accelerate option.
24. I right clicked on the Nat_Clr layer in the Image Analysis window once more, and opened the options by clicking on the icon in the top left. I went into the NDVI tab, and made sure that red is set to 3 and infrared band is set to 4. It was.
25. I clicked over onto the Pan-Sharpening tab, and inspected the settings. They seemed to correspond to what was demonstrated in the lab.
26. I moved on to the orthorectify tab, and took a look at the settings. Without changing anything, I moved on according to the lab directions.
27. I did the same with the Hillshade tab, and after ensuring that the options are the same as shown in the example I moved on.
28. I moved on to the Stretch tab. I examined it, and moved on leaving the default settings.
29. I moved on to the Display settings. I tried to move the shade around to 45, then to -45. I shifted the Gamma settings back and forth and reset them to default at the end. I proceeded to inspect all the tools in the Image Analysis window as outlined in the directions.
30. I right clicked on my table of contents, and selected to add a group layer. I renamed it “Landsat”. I right clicked on the new layer, and added the 9 layer bands to it. The program asked me if I wanted to build pyramids each time it recognized a new band. I clicked ‘Yes’ each time, and eventually all black and white Landsat images appeared under the new group layer. I minimized all of them in order to be able to see the table of contents better.
31. I examined the table provided in the third part of the exercise, and pondered the wavelengths of each band.
32. In the image analysis toolbar, I selected bands 2, 3 and 4 by holding the shift key. Then, I pressed the layer compression button which became highlighted.
33. A new temporary layer was added to my table of contents, but the grass appeared blue in color. When I went into the layer properties, each band that appeared was named band one. Either way, I selected the third “band 1” for my red color, and the first for my blue color. It seemed to have worked just as well, and my raster gained the near-infrared appearance.
34. I renamed the new layer False_Color, and exported it to my database. The new exported layer was added to my map. I had to re-set the bands once more, but this time they were appropriately named Band 1, band 2 and band 3.
35. I went back to my Image Analysis toolbar, and attempted to shift the settings around to improve the image quality. After resetting everything back to zero, I clicked the DRA button to normalize the image at current extent. The option seemed not to help much.
36. I feel that I understand the processes explained in the lab, but I ran into two difficulties: 1) I was unable to acquire the actual imagery needed for this lab and had to work with a seemingly damaged raster that did not provide a full view of the landscape, and 2) my pan-sharpening option would not work, and I do not know why. Because of this, the pixels remained large and the image was very blurry when zoomed in.
37. For next week's lab, I intend to track down better imagery to work with and possibly make more compatible raster files.
38. I took time to create an aesthetically pleasing JPG and exported the file.
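Two of the steps above lend themselves to short scripts. First, the double extraction in step 8: Landsat downloads typically arrive as a .tar.gz archive (a gzip-compressed tar file), which is why they have to be unpacked twice. A sketch with a hypothetical file path:

    import tarfile

    archive = r"C:\GIS5990\Module1\Raster\Landsat1Level1.tar.gz"  # hypothetical path
    out_dir = r"C:\GIS5990\Module1\Raster\Landsat1Level1"

    # tarfile handles both layers (the gzip compression and the tar bundle) at once.
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(out_dir)

Second, the Composite Bands operation from step 19 can be reproduced with a single geoprocessing call. The band filenames and output location below are placeholders:

    import arcpy

    arcpy.env.workspace = r"C:\GIS5990\Module1\Raster"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Stack bands 1-3 into one composite; in the layer's symbology the red,
    # green, and blue channels are then assigned to bands 3, 2, and 1.
    arcpy.CompositeBands_management(
        "band1.tif;band2.tif;band3.tif",
        r"C:\GIS5990\Module1\Maya_Pyramids.mdb\Nat_Clr")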










At the very end, a similar layout was generated for Angkor Wat in Cambodia.
I created a new MXD file and accessed Earth Explorer to download Landsat data for the general Angkor Wat region. While I was able to use the Image Analysis window in ArcMap to combine all the appropriate bands, I found that the general look at the area did not yield a whole lot of information to me. After zooming in closer, I was able to make out the canals associated with the site. This is where the false color and SWIR layers truly came in handy.
It is difficult to estimate the size of the entire “island,” but it seems that the stone temples take up only about 20% of the site. Most of the area is still covered by jungle and surrounded by water. Though there are quite a few anthropogenic features surrounding the entire island, much of it seems quite untouched.

Sunday, August 12, 2018

GIS5103 - GIS Programming - Final Project

                As I had previously mentioned in my proposal and presentation, the script I set out to create was designed mainly for archaeology contract work. Basically, it would allow the user to obtain quick geometry calculations pertaining to the amount of various disturbances within an archaeological study area (or APE). The script would work in several distinct steps: first creating a clip polygon of each input polygon within the given APE, then adding an area column and calculating the acreage of the overlap for each intersection (while at the same time calculating the acreage of the entire APE), printing those results, creating another column in each of the clipped polygons (and writing the total study area acreage into it) as well as a percentage-of-coverage column, performing a division calculation to determine the percentage of the APE that each clip occupies, and finally printing the second set of results. If all went smoothly in PythonWin, my intention was to revisit the later chapters and create a toolbox which would allow the user to run the script as a tool with manually defined parameters.
                The first step of the operation was actually done outside of Python. I created a new MXD, imported a basemap, and zoomed in on a random area in Pennsylvania. After setting the coordinate system of the MXD to ‘State Plane – Pennsylvania South US feet’ I began creating hypothetical survey data which I would use to test my script. I drew an APE polygon around an intersection, covering the area of several residential yards. Next I created three classes of disturbance which stretched outside of and intersected with my study area. I called them ‘Wetland.shp’, ‘Excessive Slope.shp’, and ‘Modern Disturbance.shp’. (All of the above-mentioned shapefiles were polygons. The Modern Disturbance file covered the asphalt road as well as all driveways and houses, the Wetland file contained two separate parts, and the slope was made up of just one large area to the south.) Once the shapefiles were finished, I exited ArcMap. My intention was to turn it on and off while testing my script, in order to avoid errors when making edits to the same file in two different programs. The script I am creating is, as of now, limited to three separate disturbance inputs. This was done simply to conserve time in the beginning, but time permitting I would be able to go back and add more if necessary.
                I began the script by simply importing all the modules I thought might be necessary to run it. I imported arcpy, os, fileinput, and string, and from arcpy I imported env. I set the workspace to my shapefile folder and enabled overwrite output. Within the same part of the script, I decided to define all of my most common and most necessary inputs. I defined my MXD file with a file path, and the study area was defined and shortened to “clip_features”. I also defined “clip_feature1”, “clip_feature2”, and “clip_feature3” as my disturbance shapefiles. Because my first operation would be to clip the three, I defined the path of the initial clipped polygons and shortened them to “clipped_polygon_1”, “clipped_polygon2”, and “clipped_polygon3”. I thought that the best way to troubleshoot the script would be to print messages at the beginning of each portion of the script, as well as printing “complete” at the end of each operation. That way, whenever I ran the code I would know exactly how much of it worked flawlessly and what part needed fixing.
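As a rough illustration of that setup section (the workspace path and shapefile names below are placeholders following the naming described above, not the exact values from my script):

    import arcpy
    from arcpy import env

    # Workspace and overwrite settings.
    env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path
    env.overwriteOutput = True

    # The APE (study area) and up to three disturbance inputs.
    clip_features = "APE.shp"
    clip_feature1 = "Wetland.shp"
    clip_feature2 = "Excessive Slope.shp"
    clip_feature3 = "Modern Disturbance.shp"

    # Output names for the initial clipped disturbance polygons.
    clipped_polygon_1 = "Wetland_clip.shp"
    clipped_polygon2 = "Slope_clip.shp"
    clipped_polygon3 = "Disturbance_clip.shp"

    print("Setup complete")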
                The first actual operation performed by the script would be to clip all of the disturbance shapefiles in order to isolate the overlap within the APE. I used the arcpy.Clip_analysis function and set each input feature to be clipped by the clip_features polygon. I did this three individual times, once for each of the polygons, but added an “if” clause for inputs 2 and 3. The “if” condition was simply the presence of the given shapefile as defined. If it did not exist, or was not defined, the script would print a blank space and move on. The idea was to allow the script to be run with fewer than three disturbance areas whenever necessary without showing any errors. (I applied the same formula in all of the following steps when it came to performing operations on the disturbance polygons.) After the clip operation was completed, I checked my output folder in ArcCatalog to make sure that the new shapefiles were contained there. They were.
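The clip-with-existence-check pattern described above might look roughly like this (continuing the placeholder names from the setup sketch):

    import arcpy

    arcpy.env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path
    arcpy.env.overwriteOutput = True

    clip_features = "APE.shp"              # study area boundary
    clip_feature2 = "Excessive Slope.shp"  # optional second disturbance input
    clipped_polygon2 = "Slope_clip.shp"

    print("Clipping disturbances to the APE")

    # The first input is required and clipped unconditionally.
    arcpy.Clip_analysis("Wetland.shp", clip_features, "Wetland_clip.shp")

    # Optional inputs are clipped only if the shapefile actually exists, so the
    # script can run with fewer than three disturbance layers without errors.
    if arcpy.Exists(clip_feature2):
        arcpy.Clip_analysis(clip_feature2, clip_features, clipped_polygon2)
    else:
        print("")

    print("Clip step complete")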
                Next, I had to add my area columns to the APE and the clipped disturbances. I began by defining local variables to make my work easier, and I defined fieldName1 as “Area” and fieldName2 as “APE_Area”. I added the “APE_Area” column to my APE shapefile using the arcpy.AddField_management function. The column created was a Double with precision set to 15 and field length set to 10. This was necessary when dealing with a decimal point, and because the areas we calculate at work are always in acres, a decimal value makes a big difference. I repeated the step individually (as in the clipping step above) for my disturbance inputs, but instead of the “APE_Area” column I created the “Area” column. I made this distinction because I would have to merge the two shapefiles later and could not have the same column name twice with different values. My logic was that this would make my life easier later, when calculating the percentages of the area in question in a later step. I ran my script to test it, and opened my MXD to see if the changes had been made. Thus far it ran smoothly, and the APE file as well as all the initial clips contained a new column within their attribute tables.
                Within the same section of the script, I went back to perform area calculations. As I had stated before, at work we always deal with acres. I used the arcpy.CalculateField_management() function on each of the polygons, populating the “APE_Area” column in the APE shapefile with its acreage, and the “Area” column for each of the disturbance polygons. I made sure that the operation was being done on the clipped polygons instead of the originals, and added each of the operations directly below the corresponding arcpy.AddField_management operations from the previous paragraph. I saved my script and ran it, then re-opened my MXD and went to check my results. Thus far, all seemed to be in place and the columns were populated with the acreage totals, decimals included.
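The add-field and acreage calculation described in the last two paragraphs can be sketched as follows; the geometry token for acres is the standard field calculator syntax, and the file names remain placeholders:

    import arcpy

    arcpy.env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path

    clip_features = "APE.shp"
    clipped_polygon_1 = "Wetland_clip.shp"

    fieldName1 = "Area"      # acreage of each clipped disturbance
    fieldName2 = "APE_Area"  # acreage of the whole study area

    # DOUBLE fields so the acreage keeps its decimal places.
    arcpy.AddField_management(clip_features, fieldName2, "DOUBLE", 15, 10)
    arcpy.AddField_management(clipped_polygon_1, fieldName1, "DOUBLE", 15, 10)

    # Populate the fields with each polygon's area expressed in acres.
    arcpy.CalculateField_management(clip_features, fieldName2,
                                    "!shape.area@acres!", "PYTHON_9.3")
    arcpy.CalculateField_management(clipped_polygon_1, fieldName1,
                                    "!shape.area@acres!", "PYTHON_9.3")
    print("Area fields calculated")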
                The fourth step of the script was to simply print the area results. This would be the first part of the code that would allow the user to obtain new information about the data they are working with. In order to display the results in a user-friendly manner I used the search cursor function. For the APE, I pointed the cursor at the “APE_Area” field and printed a message joined from two strings: "The APE has a total area of", (row), "acres." This way, Python would print out the acreage of each separate polygon feature within the shapefile in question. For the next three input files, I redefined the cursor to focus on the “Area” field instead, in order to obtain the acreage of the clipped polygons. Because the names and the order of the disturbances would vary depending on the project, the message I printed was: "The first disturbance listed includes an area the size of", (row), "acres within the APE." I saved and ran my script, and all was well thus far. I was happy to see that both rows of the first disturbance were listed separately and contained correct, individual acreages.
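The printing step with a search cursor might look like this minimal sketch (field and file names as before, placeholders):

    import arcpy

    arcpy.env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path

    # Report the APE acreage (one line per polygon in the shapefile).
    with arcpy.da.SearchCursor("APE.shp", ["APE_Area"]) as cursor:
        for row in cursor:
            print("The APE has a total area of {0} acres.".format(round(row[0], 2)))

    # Report the acreage of each clipped disturbance feature.
    with arcpy.da.SearchCursor("Wetland_clip.shp", ["Area"]) as cursor:
        for row in cursor:
            print("The first disturbance listed includes an area the size of "
                  "{0} acres within the APE.".format(round(row[0], 2)))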
                Next came the time to import the “APE_Area” field into my disturbance polygons. I began running into a bit of trouble here, as I had initially hoped that I would be able to perform these calculations by calling values from various polygons. For example, I wanted to input a simple equation where the value of the “Area” field for each of the polygons could be divided by the value of the “APE_Area” field and multiplied by 100. I intended to use the result of this equation to populate a new field in the attribute table of the disturbance polygons, in order to display the percentage of the APE that is taken up by each of these features. But after a very long time on Esri forums, and many attempts at troubleshooting, I saw that I was not able to figure out how to do this properly. It was then that I decided to use the Spatial Join function to create yet another set of shapefiles which would contain both columns in their attribute tables (meaning each shapefile would contain an “Area” field that would display the acreage of its own features, as well as an “APE_Area” field that would display the acreage of the entire APE). Before doing this, however, I defined a few local attributes once more and created “PerCofAPE”. (This field was meant to display the result of the percentage equation I had mentioned previously.)
                I executed the arcpy.SpatialJoin_analysis() function on all three of my clipped disturbance polygons (combining them with my APE polygon) and called them "OFC1_join.shp", "OFC2_join.shp" and "OFC3_join.shp" respectively. I ran another arcpy.AddField_management() operation to add the “PerCofAPE” column to each of the joined polygons as a Double, saved and ran my script. When I opened up ArcMap, I saw that the created polygons were in their correct place and after viewing their attributes I saw that they contained individual columns with each of the desired numbers as well as the new, blank “PerCofAPE” field.
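In script form, that spatial join and the new percentage field look roughly like this (the join output name follows the ones mentioned above; everything else is a placeholder):

    import arcpy

    arcpy.env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path
    arcpy.env.overwriteOutput = True

    # Join the APE attributes onto the clipped disturbance, so each feature
    # carries both its own "Area" and the overall "APE_Area".
    arcpy.SpatialJoin_analysis("Wetland_clip.shp", "APE.shp", "OFC1_join.shp")

    # Empty field that will later hold the percent-of-APE value.
    arcpy.AddField_management("OFC1_join.shp", "PerCofAPE", "DOUBLE", 15, 10)
    print("Spatial join complete")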

               The last (and most difficult) operation in my script was to populate the “PerCofAPE” field with the result of the percentage equation. (As I had mentioned above, this would be (“Area”)/(“APE_Area”)*100.) I figured that the easiest way to do this was to simply perform a field calculation on all the joined shapes using the arcpy.CalculateField_management function. It looked as though the equation would yield a result (in the form of a decimal, for now), but no matter what I did I would receive an error message stating that the field in question was not nullable. This was extremely confusing, as I had made sure that all created fields throughout the script were designated as nullable. I spent a long time trying to troubleshoot the problem, but ultimately I was not able to figure it out in time. The last remaining step in the script was supposed to be to simply print these percentages in Python in a legible manner. Unfortunately, no matter what I tried, the “PerCofAPE” field remained blank and not editable.
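For what it is worth, one way around the not-nullable error would be to skip the field calculator entirely and populate the field with an update cursor, reading both area fields row by row. This is only a sketch of an alternative I did not get working in the original script, with placeholder file names:

    import arcpy

    arcpy.env.workspace = r"C:\FinalProject\Shapefiles"  # hypothetical path

    fields = ["Area", "APE_Area", "PerCofAPE"]

    # Compute (Area / APE_Area) * 100 for every joined feature and write it back.
    with arcpy.da.UpdateCursor("OFC1_join.shp", fields) as cursor:
        for area, ape_area, _ in cursor:
            pct = (area / ape_area) * 100 if ape_area else 0
            cursor.updateRow([area, ape_area, pct])
            print("This disturbance covers {0}% of the APE.".format(round(pct, 2)))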
                Ultimately, this script proved to be a lot more challenging and time consuming than I had expected. I had previously thought that simple calculations would be easy to figure out and troubleshoot, but in this particular case it proved to be too much for me. I am proud that I was able to achieve most of what I set out to do, and if I had been able to get past this particular issue, I think the script would have worked well. I had also toyed with the idea of setting the script up as a user-friendly tool with parameters that could be set directly in ArcMap (like we did in our later modules). But unfortunately this was impossible, as I had gotten stuck on a crucial part of the script and could not move on to the final steps.


Thursday, August 9, 2018

GIS5265 - GIS Applications for Archaeology - Final Project


                Camino de Mulas was a long-distance trade route stretching from the general location of San Jose, Costa Rica, down to David, Panama. The route was operational from around 1601 CE to 1730 CE, and was eventually destroyed by the building of the Pan-American Highway. While parts of it may still remain, the exact location of the route is no longer known. Historical sources provide us with a list of known locations that intersected the route at the height of its activity, which can be used to recreate the possible location of the route by performing a series of Least Cost Path analyses in ArcGIS. While there are many unknown factors that may have affected the exact direction of the route, a good estimate of its location can be generated using basic known information and a little bit of common-sense speculation in regard to the current nature of the landscape.
                I began my process by creating individual point shapefiles, placing them at the coordinates given to us in the lab document, and labeling each location by its provided name. This step gave me an idea about the size of the area as a whole, which would in turn help me locate an appropriate DEM raster. After retrieving a large DEM from the Earth Explorer website, I used the clip tool to decrease the size of the raster to reach just above the northernmost location and just below the southernmost location. Doing this would decrease processing times for all tool operations that would need to be run when finally creating a weighted overlay raster. Because the final result would require more data than elevation alone, I took time to track down a land classification shapefile for the general region of Central America. I clipped the shapefile, and after taking some time to classify the land cover categories based on estimated difficulty in the feature's attribute table, I used the Feature to Raster tool to convert it to a workable raster.
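Those two data preparation steps (clipping the DEM and converting the classified land cover to a raster) come down to two geoprocessing calls. A sketch with placeholder names, a made-up clip extent, and an assumed difficulty field:

    import arcpy

    arcpy.env.workspace = r"C:\GIS5265\Final"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Clip the large DEM to a rectangle just covering the route's endpoints
    # (the extent values here are illustrative, not the real ones).
    arcpy.Clip_management("srtm_dem.tif", "-84.5 8.3 -82.2 10.1", "dem_clip.tif")

    # Convert the classified land cover polygons to a raster, using the field
    # holding the estimated travel difficulty; cell size roughly matches the DEM.
    arcpy.FeatureToRaster_conversion("landcover_clip.shp", "DIFFICULTY",
                                     "landcover_ras.tif", 30)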
Once I had two workable rasters, I began my work to create a weighted overlay raster. I began by creating a new toolbox, and within it a new model. Much like in the first part of the final project, I started by going into the model's properties and replicating the values set up in the previous step (Workspace, Raster Analysis, and Processing Extent). I dragged in the Slope tool and used the provided standard (.00000912) for my Z value. The raster input was, of course, the DEM downloaded from Earth Explorer, and the output raster was set to be created in a whole new folder designated for the second part of the project. I added the Reclassify tool to the model and connected the two operations together. I recreated the same setup for my Slope raster as I had done in the first part of the lab (10 grade categories).
After running the tool, my Reclass raster was successfully created. Now the time had come to combine the factors from both rasters to create a Weighted Overlay. I dragged the Weighted Overlay tool into my model and set the evaluation scale as "from 1 to 10 by 1," as previously instructed, and aligned the evaluation scales so that they match up (this time I set the NoValue field to 9 instead of 1 in an attempt to give water more weight). I added the Land Cover raster and weighted the previously defined classes as follows: easy – 1, medium – 4, hard – 9. I set the ratio of influence to 80-20 favoring the Reclass raster and ran the tool.
As I examined my overlay product, I noticed that the entire water region showed up within an easy-to-medium range. I found that this might be problematic when predicting the least cost path. The lab instructions specify that Camino de Mulas was a mule route and should not include any sort of water travel. I decided to retrace my steps and give the Land Cover raster more weight in order to contrast the results at the end. I made another Weighted Overlay raster, this time favoring the Reclass raster 60-40, to compare alongside the first raster. My hope was that giving more weight to the Land Cover raster (which did not contain any part of the ocean) would help divert the least cost route away from the water for more accurate results.
I moved on to the next step and took time to create all of the Least Cost Paths between the points. Because using a single model gave me many problems in the previous part of the lab, I decided to calculate each Cost Distance and Least Cost Path individually. This is where I took time to rename my location point files by adding an order number after their names. (This made it easier to work in order.) I used the Cost Distance tool on my origin point and created a new backlink raster. Then, I used the new products to calculate the Least Cost Path between the first two points. I followed these steps for each individual path, for both of my weighted rasters, until the entirety of the Camino de Mulas was mapped.
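Each leg was generated with the same pair of Spatial Analyst calls, so a single segment can serve as a sketch (the point file names, cost surface, and outputs are placeholders):

    import arcpy
    from arcpy.sa import CostDistance, CostPath

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS5265\Final"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    cost_surface = "weighted_overlay_60_40.tif"  # hypothetical cost raster name

    # Accumulated cost and backlink rasters from the first location.
    cost_dist = CostDistance("01_SanJose.shp", cost_surface,
                             out_backlink_raster="backlink_01.tif")
    cost_dist.save("costdist_01.tif")

    # Least cost path from the first location to the second.
    path = CostPath("02_NextStop.shp", cost_dist, "backlink_01.tif", "BEST_SINGLE")
    path.save("path_01_02.tif")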
At the very end, I created two new line features and traced over the Least Cost Paths in order to make them more legible at a large scale. I added a shapefile of Central America's road systems to account for any overlap, created a new data frame, and set up my map layout.
As I examined the contrast between the two different weighted rasters, it seems that the 60-40 raster is likely to be more accurate. I conclude this based on the direction the path takes around the area of Paso Real, Boruca, and Palmar Sur in the 80-20 weighted raster. After approaching this area along the coastline (presumably pulled along by the water), the path cuts across Palmar Sur, moves north to Boruca, then to Paso Real, and cuts back to Palmar Sur. This seems counter-productive, and the historical document provided clearly states that the railway section near Palmar Sur ought to be the third stop of the above-mentioned locations. The Least Cost Paths generated using the 60-40 Weighted Overlay raster seem to provide a much more reasonable vision of the direction that Camino de Mulas most likely took.


Sunday, August 5, 2018

GIS5103 - GIS Programming - Module 11

From the very start of this course, the very idea of coding and writing scripts was a bit intimidating. I had not taken a computer-related class since sixth grade, and though I enjoyed doing GIS work, I did not think I would be able to grasp the finer workings of its programs. However, after completing some three or four labs, I began to feel that working with Python was coming easier and easier. In a way, it was a little bit like learning another language: the more one uses the language, the better one's grammar and understanding become.
Overall, I really love the fact that I was able to dive into the “logic” of the tools used in ArcGIS. Knowing common Python commands, it is now easier to understand tools which I had never used before. Knowing how many of them are constructed allows me to visualize how pre-made tools may be scripted, and how they may be modified into custom tools when needed. With a better understanding of code and scripting, it truly feels like there are very few limitations when using ArcMap.
One of the most useful exercises for me were the tool-creation modules at the end of the semester. From my personal experience, there are many people within the fields of archaeology or environmental science who are not classically trained GIS users but use ArcGIS either way to perform simple tasks related to their work. Being able to create and modify script tools allows me to help these GIS users in a way that is useful to them, but does not require them to spend a lot of time training.

Wednesday, August 1, 2018

GIS5103 - GIS Programming - Module 10

This week's lab focused on creating a custom tool based on an already existing script. This was done mostly within ArcCatalog in a blank ArcMap project. The necessary parameters could be set by pulling up the script's properties within the correct toolbox.



The script had to be modified in order to make the tool compatible with whatever inputs the GIS user chooses each time the tool is run. All file paths and feature classes have to be generalized and defined as parameters within the tool, which the script then reads using arcpy's GetParameterAsText() function.
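As a generic illustration of that pattern (not the exact script from the lab), the hard-coded paths are replaced with parameter reads at the top of the script:

    import arcpy

    # Parameter indices match the order defined on the script tool's
    # Parameters tab in ArcCatalog (the names here are illustrative).
    input_fc = arcpy.GetParameterAsText(0)     # input feature class chosen by the user
    buffer_dist = arcpy.GetParameterAsText(1)  # e.g. "100 Meters"
    output_fc = arcpy.GetParameterAsText(2)    # output feature class

    arcpy.Buffer_analysis(input_fc, output_fc, buffer_dist)

    # AddMessage writes to the tool's progress dialog instead of the console.
    arcpy.AddMessage("Buffer written to {0}".format(output_fc))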



Thursday, July 26, 2018

GIS5103 - GIS Programming - Module 9

This module focused heavily on working with rasters. After creating many temporary rasters within the script, I combined six of them to create a new raster file which shows specific ranges of slope and aspect.



1.       There are two steps near the end that gave me a bit of trouble in this exercise. The first was the map algebra equations. I tried to combine both of the slope conditions into a single line, separating them with the “and” keyword, for example: goodslope = slope < 20 and slope > 5. This gave me errors, and after looking ahead in the instructions I realized that these operations had to be done separately. After splitting the slope and aspect calculations into four separate steps, I used the four temporary rasters and combined them all together to create the final product.
2.       At the very end, I also encountered an issue when trying to run my finalized script. I had initially saved each new temporary raster to a permanent one with the save function. Each one of these saves gave me an error stating that the file I was trying to create already existed. This seemed strange to me, as overwrite output was already enabled. In order to make the script work I had to delete the save calls; it ran fine right afterwards.
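For reference, the combined condition can also be written on one line by using map algebra's boolean operators instead of Python's and keyword, which is what causes the error. A sketch with made-up raster names and an illustrative aspect range:

    import arcpy
    from arcpy.sa import Raster

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS5103\Module9"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    slope = Raster("slope.tif")
    aspect = Raster("aspect.tif")

    # Python's "and" cannot be applied to rasters, but "&" can, so each
    # comparison is parenthesized and joined with the bitwise-and operator.
    good_slope = (slope > 5) & (slope < 20)
    good_aspect = (aspect > 150) & (aspect < 270)  # range chosen for illustration

    final = good_slope & good_aspect  # cells meeting all four conditions
    final.save("good_terrain.tif")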

Friday, July 20, 2018

GIS5265 - GIS Applications for Archaeology - Module 9






This week's module focused on remote sensing and its capabilities within the field of archaeology. The map on the left shows the general area of Cahokia State Park. The raster was classified using supervised classification into seven separate categories. The extent indicator shows the immediate area of Monks Mound, the archaeological feature.



The map to the right shows the results received when running the aerial raster through the unsupervised classification procedure. In my case, this method (which did not involve me creating classification points) turned out to be more accurate. There is also an extra category which does not apply to any of the fields; I simply called it "Unclassified."

Wednesday, July 18, 2018

GIS5103 - GIS Programming - Peer Review 2

Amadeusz Zajac
GIS Programming
GIS 5103
7/18/2018

"Peer Review of Silva and Taborda (2013)"

The Silva and Taborda (2013) paper discusses the use of the BeachMM (Beach Morphodynamic Model) tool, and the advantages it may provide in creating an easy-to-use and reliable predictive model for beach morphodynamics. The tool comes provided with ArcGIS 10, and can easily be found and accessed in the Spatial Analyst toolbox. Created using the Python scripting language (due mainly to Python's strong compatibility with the ArcGIS software), the BeachMM tool combines the potential of the already existing wave/morphological numerical models SWAN and XBeach. Because these particular models require a higher level of skill, and lack the more user-friendly capabilities of ArcGIS, the BeachMM tool script's main focus was to simplify and accelerate the process necessary for creating predictive models.
The Silva and Taborda (2013) paper rightfully begins by defining the capabilities already provided in both the SWAN and XBeach applications. The purpose of the SWAN model is mainly to predict wave size and behavior based on known physical properties of the study region. First and foremost, the shape of the ocean floor as well as any unique, known geographical features can have a considerable impact on the behavior and formation of waves. SWAN also takes into consideration non-constant metrics such as wind direction and intensity in order to make its prediction. While this is clearly an important factor, it seems to me that an accurate prediction depends greatly on the availability of accurate meteorological information at the time of the given weather event.
The XBeach model is described as a tool used to evaluate changes to beach geomorphology, specifically when considering a particular natural event, such as a storm. The tool provides reliable predictions of changes to the beach landscape via dune erosion, overwash, and breaching, as has been demonstrated on several different occasions in various European countries.
The Silva and Taborda (2013) article moves on to explain the mechanics behind the BeachMM tool, and explains the significance of the order of operations when drawing from the above-mentioned model data. The tool can be divided into four basic stages of operation. The first is the conversion between different bathymetric raster formats, where the data from SWAN and XBeach are made compatible within a single operation. The second is the creation of model input parameter files according to both user input and bathymetric beach properties; the latter would save some time and confusion for a GIS user who is not normally familiar with SWAN and XBeach. The third is saving the output data in an ArcGIS-friendly format, and the fourth is calling the external models to run within ArcGIS.
The paper specifically states that the BeachMM tool was shown to be effective in reducing work time, increasing the user-friendliness of the above-mentioned tools, and increasing the number of platforms that can run the operation (specifically ArcGIS). Silva and Taborda (2013) do, however, make a point of stating that the reliability of the results depends hugely on the accuracy and quality of the SWAN and XBeach data already provided. I think that this is an extremely important point, and very important to consider for those who are not familiar with working with these models. It is suggested that, in order for the BeachMM tool to be proven reliable over the long run, its results need to be tested continually to truly understand the tool's reliability.
While the misuse of the BeachMM tool is said to have been minimized by "including in the user interface only the parameters needed for the operational use, being the calibration parameters included in a concealed file only handled by SWAN or XBeach experienced users" the tool is still implied to have troubleshooting drawbacks, and verification of the results still strongly depends on individuals capable of interpreting SWAN and XBeach data.

(link: https://hostsited2l.uwf.edu/content/enforced/1024185-51499GIS5103201805/SupplementalReadings/PotentialsForPeerReview/Silva_et_al_2013.pdf?_&d2lSessionVal=Vtck5anLgZVBTLIENBNOMIwsc&ou=1024185)

Friday, July 13, 2018

GIS5265 - GIS Applications for Archaeology - Module 8



1- This week's exercise focused on ArcScene, attempting to create a 3D model based on a series of point shapefiles containing strat depth information for each shovel test pit, represented by a point on a 2D surface.

2- In ArcScene, I took the surface interpolations for all three of the strata which were generated in the first part of the assignment, and edited the extrusion and base settings in order to display an average strat surface on a 3D model.



3- For the third part of the assignment, we created a flyby video of the project.



Link to the flyby video of the Module 8

4- Finally, I created 3D layer data for a point shapefile representing a disturbance cable which cuts across the study area of the site. The stratum depths cut by the disturbance were color coded differently from the STP data.


Friday, July 6, 2018

GIS5265 - GIS Applications for Archaeology - Module 7


This week's lab focused on various techniques of surface interpolation. The initial layout above shows the five basic techniques that we performed using ArcToolbox for the given study on the Barillis site, while the three examples at the bottom show the three techniques that we ran on data from the Machalilla site in Ecuador. The latter data came in AutoCAD format and required manipulation of a basic text file in Excel in order to display. The coordinate system remains unknown, but I was able to get the data to work with a few basic surface interpolation operations.
This extra poster shows yet another surface interpolation operation performed on the Machalilla site data, this time using the IDW tool. The various images show the outcome of running the tool at various power settings. It seemed to me that the output begins to become distorted once the power is set well over 100, but the most thorough results seem to come at powers of 50-100.
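The power comparison can be automated by looping over the IDW tool. A minimal sketch, with a hypothetical point shapefile, value field, and list of power settings:

    import arcpy
    from arcpy.sa import Idw

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS5265\Module7"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    points = "machalilla_points.shp"  # hypothetical input point shapefile
    z_field = "DEPTH"                 # hypothetical value field

    # Run IDW at several power settings and save each surface for comparison.
    for power in [2, 25, 50, 100, 150]:
        surface = Idw(points, z_field, power=power)
        surface.save("idw_power_{0}.tif".format(power))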


Breakdown of the Methodology

Despite having used five surface interpolation techniques (all of which were performed on the same archaeological sites), it is still difficult to confidently state which is right for any given project. I think it is reasonable to state that each of these surface interpolation tools could have different advantages and disadvantages in estimating a site surface on archaeological sites.

The inverse distance weighted (IDW) tool works by estimating the value of a general area based on the known values of a given attribute at surrounding points, with closer points carrying more weight. For example, a series of positive shovel tests on a site will show as an overall positive area, with the most artifact-rich test most likely ending up near the center. This is a very useful generalization for deciding which area ought to be tested further, but when dealing with a small, isolated feature the probability surface can become completely distorted on the map.
The result of running the spline tool minimizes surface curvature and produces a smooth overlay that passes through the relevant points. I don't see this being a preferred interpolation method for finding locations of settlements in most cases. In large APEs, however, the spline tool may make it easier to see consistency in the distance between points.

Both the positive and negative aspects of the Natural Neighbor technique stem from the fact that the tool tends to generalize trends for the presence of an attribute based on the nearest input features. The positive side of this is that the tool may smooth over possible outlier tests that have no relevance. Conversely, however, it is beneficial to know where the highest artifact concentrations are located, as they are generally indicative of more human activity.
The Kriging method works by generalizing surface trends based on the z-values of the data points in question. The degree of accuracy when dealing with low values tends to be quite high. The main disadvantage of the Kriging method, however, is that points with high values tend to lose a bit of accuracy, despite the fact that they tend to be the main points of interest in archaeology.

The Kernel Density tool takes into account the weight of all specified features (line and/or point) and assigns more weight where features are more concentrated. This method seems to be the most advantageous for archaeologists seeking site locations, but the areas do seem more generalized, and the edges seem less sharp than those produced by tools such as IDW or Natural Neighbor.


Spatial Analyst Potential



The Spatial Analyst toolset provides a great arsenal for statistical analysis within many different fields of study. In archaeology alone, these tools can provide us with estimated boundaries of settlements and sites, and/or distributions of specific artifact classes. This information can be extremely useful for estimating the cost of later phases of work in CRM, and for providing precise spatial information to clients in order to minimize impact from construction.

Sunday, July 1, 2018

GIS5103 - GIS Programming - Module 6

This module emphasized the use of several tools through a Python script. For the given assignment, I had to create a script from scratch that would recognize a workspace on the computer hard drive, and from this workspace it would perform the following functions:

-Add XY to a specific shapefile
-Create a 100 meter buffer around the points on the shapefile
-Dissolve all attributes of the buffers and turn the shapefile into a single feature.

I did my best to make the script as well as the results legible. The screenshot above shows the messages received after the tool was run.
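A minimal sketch of those three steps is below; the shapefile names and output paths are placeholders rather than the actual lab data:

    import arcpy

    arcpy.env.workspace = r"C:\GIS5103\Module6"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    points = "sites.shp"  # hypothetical input point shapefile

    # 1. Add X and Y coordinate fields to the point shapefile.
    arcpy.AddXY_management(points)
    print("XY coordinates added")

    # 2. Buffer each point by 100 meters.
    arcpy.Buffer_analysis(points, "sites_buffer.shp", "100 Meters")
    print("100 meter buffers created")

    # 3. Dissolve the buffers into a single feature.
    arcpy.Dissolve_management("sites_buffer.shp", "sites_dissolved.shp")
    print("Buffers dissolved into a single feature")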

Tuesday, June 19, 2018

GIS5103 - GIS Programming - Module 5



This week's module focused on geoprocessing, specifically geoprocessing using tools available in ArcMap. We created a model which utilized three different tools: Clip, Select, and Erase. Then we exported the model as a Python script and modified it so that it ran via PythonWin. The screenshot above shows the successful final product of the assignment.
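The exported script essentially boiled down to three geoprocessing calls in sequence; a simplified sketch with placeholder file names and an illustrative selection clause:

    import arcpy

    arcpy.env.workspace = r"C:\GIS5103\Module5"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # 1. Clip the input features to the study area boundary.
    arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

    # 2. Select the polygons matching an attribute query (clause is illustrative).
    arcpy.Select_analysis("soils_clip.shp", "soils_select.shp",
                          "\"CLASS\" = 'unsuitable'")

    # 3. Erase the selected polygons from the clipped features.
    arcpy.Erase_analysis("soils_clip.shp", "soils_select.shp", "soils_final.shp")

    print(arcpy.GetMessages())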

Friday, June 15, 2018

GIS5103 - GIS Programming - Peer Review of Bakker et al 2016

Peer review of Bakker et al 2016

The Bakker et al. 2016 paper focuses on a Python-based tool called FloPy, which is meant specifically for creating predictive models of groundwater flow. FloPy is presented as an alternative to the commonly used GUIs (graphical user interfaces) for predicting the spread of groundwater, taking into account the many different environmental factors that may impact it. The paper takes time to explain the mechanisms within the code step by step, and thoroughly explains the logic of each line it contains. It uses specific examples from the script, as well as several graphs and charts, to highlight the potential advantage of using the seemingly old-fashioned scripting method. I myself have zero training in hydrology, and my peer review of this paper can only be done from the perspective of a novice Python user. My personal interest lies in the fact that the scripting mechanisms used in creating a predictive model for groundwater flow in FloPy could be used for predictive modeling across various other disciplines.

In the introduction, Bakker et al. 2016 briefly describe the mechanisms behind a typical GUI. The process is presented clearly and is understandable even without much prior knowledge about hydrology or predictive modeling. This helps quite a bit in being able to really understand the advantages of FloPy over the currently more popular GUIs. The paper also explains that the Python language was chosen specifically because it has a very powerful syntax and many tools at the programmer's disposal (thus earning its popularity in scientific circles).

The next portion of the study discusses a specific example of FloPy in use. The authors do a great job explaining the function of the tool step by step, which allows for a thorough understanding of its functions. As with every script, there is a set of attributes to be defined at the very beginning, including but not limited to geographical boundaries, water level depth, and periodic water fluctuation. Specific instructions are provided for how the varying metrics ought to be filled out in order for the model to run appropriately. The next part of the paper focuses on more advanced applications of FloPy, and describes how individual scripters can go about adding unique metrics that would need to be considered in specialized settings applying to more complicated study areas. Several specific studies that used the FloPy approach are cited and highlighted, which in its own way serves as a test of the paper's initial hypothesis.

In conclusion, the advantages of using FloPy over current GUI methods are several. First, the model tends to be processed about twice as fast. Second, a script automatically provides a record of each specific run, which can itself be peer reviewed, modified, and re-tested. Third, the capabilities of Python allow for a more specialized predictive model than is currently available with the more popular GUIs. The Bakker et al. 2016 paper presented a hypothesis clearly, demonstrated that it had been tested on several occasions, and drew logical conclusions based on the results of those tests.


Link: https://hostsited2l.uwf.edu/content/enforced/1024185-51499GIS5103201805/SupplementalReadings/PotentialsForPeerReview/Bakker_et_al-2016-Groundwater.pdf?_&d2lSessionVal=OhmXFvyMsrT2qJz3PCvegYrVo&ou=1024185