Thursday, July 26, 2018

GIS5103 - GIS Programming - Module 9

This module focused heavily on working with rasters. After creating several temporary rasters within the script, I combined six of them into a new raster file that shows specific degrees of slope and aspect.



1. There were two steps near the end of this exercise that gave me a bit of trouble. The first was the map algebra expressions. I tried to combine both of the slope calculations into a single line, separating them with the "and" operator, for example: goodslope = slope < 20 and slope > 5. This gave me errors, and after looking ahead in the instructions I realized that these operations had to be done separately. After splitting the slope and aspect calculations into four separate steps, I combined the four temporary rasters to create the final product (see the sketch after this list).
2. At the very end, I also encountered an issue when trying to run my finalized script. I had initially saved each new raster to a permanent one with the save function. Each of these saves gave me an error stating that the file I was trying to create already existed. This seemed strange to me, since overwrite output was already enabled. In order to make the script work I had to delete those save calls; it ran fine right afterwards.
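Below is a minimal sketch of the approach described above, assuming arcpy with the Spatial Analyst extension. The workspace path, raster names, and the aspect thresholds are placeholders rather than the exact values from the assignment.

```python
import arcpy
from arcpy.sa import Slope, Aspect, Raster

arcpy.env.workspace = r"C:\GISProgramming\Module9"   # hypothetical workspace
arcpy.env.overwriteOutput = True
arcpy.CheckOutExtension("Spatial")

elev = Raster("elevation")   # input elevation raster (assumed name)
slope = Slope(elev)
aspect = Aspect(elev)

# Each comparison has to be its own temporary raster; chaining them with a
# Python "and" raises an error, so the boolean rasters are multiplied instead.
good_slope1 = slope > 5
good_slope2 = slope < 20
good_aspect1 = aspect > 150     # placeholder aspect thresholds
good_aspect2 = aspect < 270

# Only the final combined raster is saved explicitly; the temporary rasters
# above are left unsaved to avoid the "file already exists" errors.
final = good_slope1 * good_slope2 * good_aspect1 * good_aspect2
final.save("final_suitability.tif")
```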

Friday, July 20, 2018

GIS5265 - GIS Applications for Archaeology - Module 9






This week's module focused on remote sensing and its capabilities within the field of archaeology. The map on the left shows the general area of Cahokia State Park. The raster was classified using supervised classification into seven separate categories. The extent indicator shows the immediate area of Monks Mound, the archaeological feature.



The map to the right shows the results of running the aerial raster through the unsupervised classification procedure. In my case, this method (which did not involve creating classification points) turned out to be more accurate. There is also an extra category which does not correspond to any of the fields; I simply called it "Unclassified".
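For reference, here is a minimal sketch of an unsupervised classification run, assuming arcpy with the Spatial Analyst extension. The raster name and workspace are placeholders, and the class count of eight (the seven categories plus the extra unclassified one described above) is illustrative.

```python
import arcpy
from arcpy.sa import IsoClusterUnsupervisedClassification

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Archaeology\Module9"   # hypothetical workspace
arcpy.env.overwriteOutput = True

# Run ISO Cluster unsupervised classification on the aerial image.
classified = IsoClusterUnsupervisedClassification("cahokia_aerial.tif", 8)
classified.save("cahokia_unsupervised.tif")
```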

Wednesday, July 18, 2018

GIS5103 - GIS Programming - Peer Review 2

Amadeusz Zajac
GIS Programming
GIS 5103
7/18/2018

"Peer Review of Silva and Taborda (2013)"

The Silva and Taborda (2013) paper discusses the BeachMM (Beach Morphodynamic Model) tool and the advantages it may provide in creating an easy-to-use and reliable predictive model for beach morphodynamics. The tool is designed to work within ArcGIS 10, where it can be found and accessed alongside the Spatial Analyst tools. Created in the Python scripting language (due mainly to Python's strong compatibility with the ArcGIS software), the BeachMM tool combines the potential of the existing wave/morphological numerical models SWAN and XBeach. Because these models require a higher level of skill and lack the more user-friendly capabilities of ArcGIS, the main focus of the BeachMM script was to simplify and accelerate the process of creating predictive models.
The Silva and Taborda (2013) paper rightfully begins by defining the capabilities already provided by both the SWAN and XBeach applications. The purpose of the SWAN model is mainly to predict wave size and behavior based on known physical properties of the study region. First and foremost, the shape of the ocean floor, as well as any unique, known geographical features, can have a considerable impact on the behavior and formation of waves. SWAN also takes into consideration non-constant metrics such as wind direction and intensity in order to make its prediction. While this is clearly an important factor, it seems to me that an accurate prediction depends greatly on the availability of accurate meteorological information at the time of the given weather event.
The XBeach model is described as a tool used to evaluate changes to beach geomorphology, specifically in the context of a particular natural event, such as a storm. The tool provides reliable predictions of changes to the beach landscape via dune erosion, overwash, and breaching, as has been demonstrated on several occasions in various European countries.
The Silva and Taborda (2013) article moves on to explain the mechanics behind the BeachMM tool and the significance of the order of operations when drawing from the above-mentioned model data. The BeachMM Python tool implements four main tasks. The first is the conversion between different bathymetric raster formats, so that the data used by SWAN and XBeach are made compatible within a single operation. The second is the creation of model input parameter files, according to both user input and bathymetric beach properties; this would save some time and confusion for a GIS user who is not normally familiar with SWAN and XBeach. The third is saving the output data in an ArcGIS-friendly format, and the fourth is calling the external models to run from within ArcGIS.
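To make that workflow concrete, here is a rough illustrative sketch of the four-stage pattern the review describes. It is not the actual BeachMM code, and every path, file name, executable name, and parameter below is an assumption.

```python
import subprocess
import arcpy

workspace = r"C:\BeachMM_demo"          # hypothetical workspace
arcpy.env.workspace = workspace
arcpy.env.overwriteOutput = True

# 1. Convert the input bathymetric raster to the ASCII grid the models expect.
arcpy.RasterToASCII_conversion("bathymetry", workspace + r"\bathy.asc")

# 2. Write a simple parameter file from user-supplied values.
with open(workspace + r"\swan_input.txt", "w") as f:
    f.write("WIND 10.0 270\n")          # example wind speed / direction

# 3. Call the external model executables and wait for them to finish.
subprocess.call([workspace + r"\swan.exe", "swan_input.txt"])
subprocess.call([workspace + r"\xbeach.exe", "params.txt"])

# 4. Load the model output back into ArcGIS as a raster.
arcpy.ASCIIToRaster_conversion(workspace + r"\model_output.asc",
                               workspace + r"\beach_change.tif", "FLOAT")
```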
The paper specifically states that the BeachMM tool was shown to be effective in reducing work time, increasing the user-friendliness of the above-mentioned tools, and increasing the number of platforms (specifically ArcGIS) from which they can be run. Silva and Taborda (2013) do, however, make a point of stating that the reliability of the results depends hugely on the accuracy and quality of the SWAN and XBeach data already provided. I think this is an extremely important point to consider for those who are not familiar with working with these models. It is suggested that, for the BeachMM tool to be proven to work consistently, the results must be tested continually in order to truly understand the tool's reliability.
While the misuse of the BeachMM tool is said to have been minimized by "including in the user interface only the parameters needed for the operational use, being the calibration parameters included in a concealed file only handled by SWAN or XBeach experienced users," the tool is still implied to have troubleshooting drawbacks, and verification of the results still depends strongly on individuals capable of interpreting SWAN and XBeach data.

(link: https://hostsited2l.uwf.edu/content/enforced/1024185-51499GIS5103201805/SupplementalReadings/PotentialsForPeerReview/Silva_et_al_2013.pdf?_&d2lSessionVal=Vtck5anLgZVBTLIENBNOMIwsc&ou=1024185)

Friday, July 13, 2018

GIS5265 - GIS Applications for Archaeology - Module 8



1- This week's exercise focused on ArcScene and on creating a 3D model from a series of point shapefiles containing strat depth information for each shovel test pit, represented by a point on a 2D surface.

2- In ArcScene, I took the surface interpolations for all three of the strata generated in the first part of the assignment, and edited the extrusion and base-height settings in order to display an average strat surface on a 3D model (a scripted parallel to this step is sketched after this list).



3- For the third part of the assignment, we created a flyby video of the project.



Link to the flyby video for Module 8

4- Finally, I created 3D layer data for a point shapefile representing a disturbance cable which cuts across the study area of the site. The stratum depths cut by the disturbance were color coded differently from the STP data.
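As a reference for steps 2 and 4, here is a minimal scripted parallel (not the ArcScene display settings actually used in the lab) for turning an STP point shapefile into 3D features, assuming arcpy with the 3D Analyst extension; the file and field names are placeholders.

```python
import arcpy

arcpy.CheckOutExtension("3D")
arcpy.env.workspace = r"C:\Archaeology\Module8"   # hypothetical workspace
arcpy.env.overwriteOutput = True

# Give each shovel test point a z-value taken from its stratum depth field
# (negative depths push the points below the ground surface).
arcpy.FeatureTo3DByAttribute_3d("stp_stratum1.shp", "stp_stratum1_3d.shp",
                                "DEPTH_CM")
```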


Friday, July 6, 2018

GIS5265 - GIS Applications for Archaeology - Module 7


This week's lab focused on various techniques of surface interpolation. The initial assignment above shows the five basic techniques that we ran through ArcToolbox for the given study of the Barillis site, while the three examples at the bottom show the three techniques that we ran on data for the Machalilla site in Ecuador. The latter data came in AutoCAD format and required manipulation of a basic text file in Microsoft Excel in order to display. The coordinate system remains unknown, but I was able to get the data to work with a few basic surface interpolation operations.
This extra poster shows yet another surface interpolation operation performed on the Machalilla site data, this time using the IDW tool. The various images show the outcome of running the tool at various power settings. It seemed to me that the output begins to distort once the power is set well over 100, while the most thorough result seems to come at powers of 50-100.
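Below is a minimal sketch of how this power comparison could be scripted, assuming arcpy with the Spatial Analyst extension; the shapefile name, value field, cell size, and power values are placeholders.

```python
import arcpy
from arcpy.sa import Idw

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Archaeology\Module7"   # hypothetical workspace
arcpy.env.overwriteOutput = True

points = "machalilla_points.shp"   # placeholder point shapefile
z_field = "ARTIFACTS"              # placeholder value field

# Run IDW at several power settings and save each result for comparison.
for power in (2, 10, 50, 100, 150):
    out_raster = Idw(points, z_field, 1, power)   # cell size of 1 assumed
    out_raster.save("idw_power_{0}.tif".format(power))
```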


Breakdown of the Methodology

Despite having used five surface interpolation techniques (all performed on the same archaeological sites), it is still difficult to confidently state which is right for any given project. I think it is reasonable to say that each of these surface interpolation tools has different advantages and disadvantages when estimating a site surface on archaeological sites.

The inverse distance weighted (IDW) tool works by estimating values across a general area based on the known values of a given attribute at sampled points, with the "weight" of the output driven by the points richest in the attribute in question. For example, a series of positive shovel tests on a site will show as an overall positive area, with the most artifact-rich test most likely ending up around the center. This is a very useful generalization for deciding which area ought to be tested further, but when dealing with a small, isolated feature the probability surface can become completely distorted on the map.
The result produced by the spline tool minimizes surface curvature and yields a smooth overlay that passes through the input points. I don't see this being a preferred interpolation method for finding settlement locations in most cases. In large APEs, however, the spline tool may make it easier to see consistency in the distances between points.

Both the positive and the negative aspect of the Natural Neighbor technique is that the tool tends to generalize trends for the presence of an attribute based on the nearest input features. The positive side is that the tool may smooth over outlier tests that have no relevance. Inversely, however, it is beneficial to know where the highest artifact concentrations are located, as they are generally indicative of more human activity.
The Kriging method works by generalizing surface trends based on the z-values of the data points in question. The degree of accuracy when dealing with low values tends to be quite high. The main disadvantage of the Kriging method, however, is that points with high values tend to lose a bit of accuracy, despite the fact that they tend to be the main points of interest in archaeology.

The Kernel Density tool takes into account the weight of all the input features (line and/or point) and assigns more weight where values are higher and features are denser. This method seems to be the most advantageous for archaeologists seeking site locations, but the resulting areas do seem more generalized, and the weight edges seem less sharp than those produced by tools such as IDW or Natural Neighbor.
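As a quick reference for the methods discussed above, here is a minimal sketch that runs each of them on the same point data, assuming arcpy with the Spatial Analyst extension; the shapefile, field names, cell size, and kriging model are placeholders.

```python
import arcpy
from arcpy.sa import (Idw, Spline, NaturalNeighbor, Kriging,
                      KrigingModelOrdinary, KernelDensity)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Archaeology\Module7"   # hypothetical workspace
arcpy.env.overwriteOutput = True

stp = "shovel_tests.shp"    # placeholder shovel test points
z = "ARTIFACT_CT"           # placeholder artifact count field
cell = 1                    # placeholder cell size

Idw(stp, z, cell).save("surf_idw.tif")
Spline(stp, z, cell).save("surf_spline.tif")
NaturalNeighbor(stp, z, cell).save("surf_natneigh.tif")
Kriging(stp, z, KrigingModelOrdinary("SPHERICAL"), cell).save("surf_kriging.tif")
# Kernel density weights each point by the field value rather than fitting a surface.
KernelDensity(stp, z, cell).save("surf_kdensity.tif")
```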


Spatial Analyst Potential



The Spatial Analyst toolset provides a great arsenal for statistical analysis within many different fields of study. In archaeology alone, these tools can provide us with estimated boundaries of settlements and sites, and/or distributions of specific artifact classes. This information can be extremely useful for estimating the cost of later phases of work in CRM, and for providing precise spatial information to clients in order to minimize impact from construction.

Sunday, July 1, 2018

GIS5103 - GIS Programming - Module 6

This module emphasized the use of several tools through a Python script. For the given assignment, I had to create a script from scratch that would recognize a workspace on the computer hard drive, and from this workspace it would perform the following functions (a rough sketch of the script follows the list):

-Add XY coordinates to a specific shapefile
-Create a 100-meter buffer around the points in the shapefile
-Dissolve all attributes of the buffers and turn the shapefile into a single feature.
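Below is a rough sketch of how such a script could look, assuming arcpy; the workspace path and shapefile names are placeholders rather than the ones from the assignment.

```python
import arcpy

arcpy.env.workspace = r"C:\GISProgramming\Module6"   # hypothetical workspace
arcpy.env.overwriteOutput = True

in_points = "hospitals.shp"        # placeholder input shapefile

# 1. Add X and Y coordinate fields to the point shapefile.
arcpy.AddXY_management(in_points)
print(arcpy.GetMessages())

# 2. Create a 100-meter buffer around each point.
arcpy.Buffer_analysis(in_points, "points_buffer.shp", "100 Meters")
print(arcpy.GetMessages())

# 3. Dissolve the buffers into a single feature.
arcpy.Dissolve_management("points_buffer.shp", "points_buffer_dissolved.shp")
print(arcpy.GetMessages())
```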

I did my best to make the script, as well as the results, legible. The screenshot above shows the messages received after the tool had been run.