Thursday, July 28, 2016

Exercise 9: Create a Script Tool

Goals and Objectives

The purpose of this exercise was to create a custom script tool that generates a Swiss hillshade.

Methods

In this exercise, a custom script tool was created using Python and ArcMap.  For the scripting portion of the exercise, comments were used to annotate each section of code.  The following steps were taken (a simplified sketch of the core workflow follows the list): 


  • importing system modules
  • importing Python modules: os, time, datetime, shutil, ArcGIS extensions
  • setting up local variables
  • setting up a try/except block
  • adding the workflow to the try block: divide the DEM, calculate a hillshade, run focal statistics, add the DEM back
  • setting up the except statement to print an error message
  • creating custom toolbox in ArcMap
  • using the Add Script Wizard to add a custom script tool
  • debugging and running script
  • testing tool
  • creating map in ArcMap
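
The block below is a minimal, hypothetical sketch of the try/except workflow described above, not the submitted tool code: the parameter order, the divide factors (5 and 100), and the 4 x 4 median focal neighborhood are illustrative assumptions.

#minimal sketch of the script tool workflow; parameter order and constants are assumptions
import arcpy
from arcpy.sa import *

try:
    #read the parameters exposed by the Add Script wizard
    inputDEM = arcpy.GetParameterAsText(0)
    outputHillshade = arcpy.GetParameterAsText(1)

    arcpy.CheckOutExtension('spatial')
    arcpy.env.overwriteOutput = True

    #divide the DEM to soften the relief before hillshading
    dividedDEM = Raster(inputDEM) / 5.0

    #calculate a standard hillshade from the divided DEM
    hillshade = Hillshade(dividedDEM)

    #run focal statistics (median filter) to smooth the hillshade
    smoothedHS = FocalStatistics(hillshade, NbrRectangle(4, 4, "CELL"), "MEDIAN")

    #add the DEM back in to produce the swiss-style hillshade and save it
    swissHS = smoothedHS + (Raster(inputDEM) / 100.0)
    swissHS.save(outputHillshade)

except Exception as e:
    #print an error message if any step fails
    arcpy.AddError("Swiss hillshade tool failed: " + str(e))
    print "Swiss hillshade tool failed: " + str(e)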

Results

The final script and a map displaying the results are shown below.  

Figure 1: Map displaying Swiss hillshades






Exercise 8: Raster Processing with Booleans and Loops

Goals and Objectives

The purpose of this exercise was to use a series of loops in Python to project and clip a directory of scanned USGS topographic sheets and merge them into a seamless raster.  The study area was Kettle Moraine. 


Methods

In this exercise, four scanned USGS topographic sheets were processed with Python to produce a seamless raster.  While developing this script, comments were used to annotate each section of code.  The following steps were taken (a simplified sketch of the core pattern follows the list): 


  • importing system modules
  • importing Python modules: os, time, datetime, shutil, ArcGIS extensions
  • creating variables
  • creating spatial reference object using coordinate system of a tile index
  • using an IF statement to set up the loop
  • adding all files meeting the specified criteria to a raster list
  • setting up a processing loop for each raster
  • formatting raster names
  • projecting all rasters
  • creating a polygon of each raster's topographic footprint
  • merging tiles
  • creating an IF statement to set up a FOR IN loop
  • creating a search cursor
  • clipping rasters
  • merging the rasters together
  • debugging and running script
  • creating map in ArcMap
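
The block below is a simplified, hypothetical sketch of the list/loop/cursor pattern described above rather than the submitted script: the paths, the TileIndex feature class, and its TILE_NAME field are assumed names, and the footprint clip is reduced to a rectangle taken from each tile's extent.

#minimal sketch of the Ex8 pattern; all paths and the tile index schema are assumptions
import os
import arcpy

arcpy.env.workspace = r"C:\temp\Ex8\topo_sheets"        #folder of scanned sheets (assumed)
arcpy.env.overwriteOutput = True

geodatabasePath = r"C:\temp\Ex8\Ex8.gdb"                #output geodatabase (assumed)
tileIndex = os.path.join(geodatabasePath, "TileIndex")  #tile index polygons (assumed)

#create a spatial reference from the tile index's coordinate system
arcpy.env.outputCoordinateSystem = arcpy.Describe(tileIndex).spatialReference

#add all files of the specified criteria (TIFFs) to a raster list
rasters = arcpy.ListRasters("", "TIF")
clipList = []

for raster in rasters:
    #format the raster name and project the raster into the geodatabase
    fileName = os.path.splitext(os.path.basename(raster))[0]
    rasterProj = os.path.join(geodatabasePath, fileName + "_Proj")
    arcpy.ProjectRaster_management(raster, rasterProj, "", "BILINEAR")

    #use a search cursor on the tile index to find this sheet's footprint,
    #then clip the projected raster to that footprint's extent
    with arcpy.da.SearchCursor(tileIndex, ["TILE_NAME", "SHAPE@"]) as cursor:
        for row in cursor:
            if row[0] == fileName:
                ext = row[1].extent
                rectangle = "{0} {1} {2} {3}".format(ext.XMin, ext.YMin, ext.XMax, ext.YMax)
                rasterClip = rasterProj + "_Clip"
                arcpy.Clip_management(rasterProj, rectangle, rasterClip)
                clipList.append(rasterClip)

#merge the clipped tiles into a single seamless raster (assumes 3-band scans)
arcpy.MosaicToNewRaster_management(clipList, geodatabasePath, "Topo_Merged",
                                   "", "8_BIT_UNSIGNED", "", 3)

print "Mosaic is complete."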


Results

The final map and script are shown below.  

Figure 1: Map displaying the four merged USGS topo quads








Monday, July 25, 2016

Exercise 7: Risk Model for Landslide Susceptibility in Oregon

Goals and Objectives

The purpose of this exercise was to generate a risk model for landslides in Oregon using Python scripting.  


Methods

In this exercise, the data from Exercise 5 and Exercise 6 was used to create a risk model for landslides in Oregon with the help of Python.  While developing the script, comments were utilized to annotate each section of code.  The following steps were taken:


  • importing system modules
  • importing Python modules: os, time, datetime, shutil, ArcGIS extensions
  • creating smart variables and a list
  • creating a fishnet
  • buffering roads
  • intersecting road buffer and fishnet
  • creating variables for the reclass values of slope and landcover
  • applying the reclass values using the reclassify tool
  • multiplying the rasters together to create a risk raster
  • using zonal statistics as a table to tally risk values
  • joining results 
  • debugging and running script
  • creating a map in ArcMap


Results

The final script and a map displaying the results are shown below.  


Figure 1: Map showing risk model results


#-------------------------------------------------------------------
# Name:        Ex7
# Purpose:    Generate a risk model for landslides in Oregon
#
# Author:      williaan
#
# Created:     25/07/2016
# Copyright:   (c) williaan 2016

#-------------------------------------------------------------------


#import system modules
import os
import shutil
import time
import datetime
import arcpy
from arcpy import env
from arcpy.sa import *
from arcpy.ddd import *

#set the workspace and allow the script to overwrite existing files
arcpy.env.workspace = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb"
arcpy.env.overwriteOutput = True
#setting output CS to NAD 1983 HARN Oregon Statewide Lambert
arcpy.env.outputCoordinateSystem = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\OregonBoundary"
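#check out the 3D Analyst and Spatial Analyst extensions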
arcpy.CheckOutExtension('3D')
arcpy.CheckOutExtension('spatial')

#create variables
geodatabasepath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb"
featureDatasetPath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\OregonFCs"

majorRoads = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\OregonFCs\MajorRoads_NW"
landslidePts = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\OregonFCs\LandslidePoints_NW"
precipData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\NW_Oregon_PrecipitationData_800m"
landUseData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\NLCD2011_Oregon_NW"
slopeData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\NW_Oregon_Slope_30m"

boundaryName = "ProcessingBoundary"
fishnetName = "Oregon_NW_1kmFishnet"
roadsBufferName = "MajorRoads_NW_Buffered"
roadsBuffIntName = "MajorRoads_NW_Buffer_1kmGrid"
riskRasterName = "NW_Oregon_LandslideRisk"
pointName = os.path.basename(landslidePts).rstrip(os.path.splitext(landslidePts)[1])

processingBoundary = os.path.join(featureDatasetPath,boundaryName)
outputFishnet = os.path.join(featureDatasetPath,fishnetName)
roadBufferOutput = os.path.join(featureDatasetPath,roadsBufferName)
roadBufferGridOutput = os.path.join(featureDatasetPath,roadsBuffIntName)
slopeReclassRaster = slopeData + "_Reclass"
lulcReclassRaster = landUseData + "_Reclass"

landslideRiskRaster = os.path.join(geodatabasepath,riskRasterName)
landslideRiskStatistics = landslideRiskRaster + "_Stats"

#field names
bufferedRoadsUniqueIDField = "FID_Oregon_NW_1kmFishnet"

#create processing boundary by extracting the footprint from the slope raster
print "Creating processing boundary" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.RasterDomain_3d(slopeData,processingBoundary,"POLYGON")

#create fishnet (Grid) with 1km spacing
#variables for fishnet
desc = arcpy.Describe(processingBoundary)
gridSize = 1000.0

print "Creating fishnet" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.CreateFishnet_management(outputFishnet,str(desc.extent.lowerLeft),str(desc.extent.XMin) + " " +str(desc.extent.YMax+10),gridSize,gridSize,"0","0",str(desc.extent.upperRight),"NO_LABELS",processingBoundary,"POLYGON")

#buffering
arcpy.Buffer_analysis(majorRoads,roadBufferOutput,"100 Meters","FULL","ROUND","ALL","","PLANAR")

#intersecting roadway buffer and fishnet
print "Intersecting" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Intersect_analysis([roadBufferOutput,outputFishnet],roadBufferGridOutput,"ALL","","INPUT")

#set reclass values for slope and landcover
#this string specifies different value ranges and the values these ranges will be assigned
remapStringSlope  = "0 1.484953 1;1.484954 6.184316 2;6.184317 10.883696 3;10.883697 15.583077 4;15.583078 29.681219 5;29.681220 34.380600 4; 34.380601 39.079981 3;39.079982 43.779362 2;43.779363 90 1"

#this string assigns risk values to the different NLCD classes based on their raster values; e.g. open water (raster value 11) is assigned a risk value of 1, and developed open space (raster value 21) is assigned a risk value of 4
remapStringNLCD = "11 1;21 4;22 4;23 3;24 1;31 2;41 4;42 5;43 5;52 5;71 4;81 3;82 3;90 3;95 1"

print "Reclassifying slope into individual run type classes" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Reclassify_3d(slopeData,"VALUE",remapStringSlope,slopeReclassRaster,"NODATA")

print "Reclassifying landuse into individual run type classes" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Reclassify_3d(landUseData,"Value",remapStringNLCD,lulcReclassRaster,"NODATA")

print "Multiplying the reclassed rasters to obtain a risk value"
arcpy.Times_3d(slopeReclassRaster,lulcReclassRaster,landslideRiskRaster)

#use zonal statistics as a table to tally the median risk value within each road-buffer grid cell
arcpy.gp.ZonalStatisticsAsTable_sa(roadBufferGridOutput,bufferedRoadsUniqueIDField,landslideRiskRaster,landslideRiskStatistics,"DATA","MEDIAN")

#join the median risk value back to the buffered road grid
arcpy.JoinField_management(roadBufferGridOutput,bufferedRoadsUniqueIDField,landslideRiskStatistics,bufferedRoadsUniqueIDField,"Median")

print "Script is complete."

Exercise 6: Analyzing Raster Data for Landslide Susceptibility in Oregon

Goals and Objectives

The purpose of this exercise was to use a suite of raster and vector tools to identify common characteristics of landslides in Oregon.  Characteristics examined included precipitation, slope and land use/land cover.


Methods

In this exercise, raster data concerning the state of Oregon was used to examine landslide characteristics with the help of Python.  While developing this script, comments were utilized to annotate each section of code.  The following steps were taken: 

  • importing system modules
  • importing Python modules: os, time, datetime, shutil, ArcGIS extensions
  • creating smart variables and a list
  • adding raster values from precipitation, slope and land use to point feature class
  • adding and calculating a field to determine buffer radius
  • buffering landslides
  • using zonal statistics to calculate median slope of buffered landslides
  • joining results table to buffered landslides
  • creating and using a search and update cursor to update null values
  • calculating summary statistics for raster values of precipitation, slope and slide area based on movement class, land cover type and the combination of these two factors
  • using the tabulate area tool to calculate how much of each buffer falls within different analysis classes
  • exporting table to MS Excel
  • using a loop to delete intermediate feature classes
  • debugging and running script

Results

The final script is displayed below. 

#-------------------------------------------------------------------
# Name:        Ex 6
# Purpose: To model landslide susceptibility in Oregon
#
# Author:      williaan
#
# Created:     25/07/2016
# Copyright:   (c) williaan 2016
#
#-------------------------------------------------------------------

#import system modules
import os
import shutil
import time
import datetime
import arcpy
from arcpy import env
from arcpy.sa import *
from arcpy.ddd import *

#set the workspace and allow the script to overwrite existing files
arcpy.env.workspace = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6"
arcpy.env.overwriteOutput = True
#setting output CS to NAD 1983 HARN Oregon Statewide Lambert
arcpy.env.outputCoordinateSystem = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\OregonBoundary"
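#check out the 3D Analyst and Spatial Analyst extensions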
arcpy.CheckOutExtension('3D')
arcpy.CheckOutExtension('spatial')

#create variables
geodatabasepath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb"
featureDatasetPath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\OregonFCs"
otherDataFolder = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6"

landslidePts = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\OregonFCs\LandslidePoints_NW"
precipData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\NW_Oregon_PrecipitationData_800m"
landUseData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_6\Ex6.gdb\NLCD2011_Oregon_NW"
slopeData = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\NW_Oregon_Slope_30m"


#naming conventions
print "Setting up naming conventions" ,datetime.datetime.now().strftime("%H:%M:%S")
pointName = os.path.basename(landslidePts).rstrip(os.path.splitext(landslidePts)[1])
pointSelectOutput = os.path.join(geodatabasepath,pointName) + "_Selected"
pointsWithLandUse = os.path.join(geodatabasepath,pointName) + "_LU"
pointsWithSlope = os.path.join(geodatabasepath,pointName) + "_Slope"
pointsWithPrecip = os.path.join(geodatabasepath,pointName) + "_Precip"

pointsBuffered = pointSelectOutput + "_Buffer"

bufferedSlopeStatistics = pointsBuffered + "_Slope"
bufferedSlideTypeSummary = pointsBuffered + "_Summary_SlideType"
bufferedLandUseSummary = pointsBuffered + "_Summary_LULC"
bufferedBothSummary = pointsBuffered + "_Summary_Both"
bufferedTabulateLULC = pointsBuffered + "_TabulateLULC"

tabulateLULC_ExcelTable = os.path.join(otherDataFolder,os.path.basename(bufferedTabulateLULC).rstrip(os.path.splitext(bufferedTabulateLULC)[1])) + ".xls"

#creating list of feature classes to be created
print "Creating list of feature classes" ,datetime.datetime.now().strftime("%H:%M:%S")
createdFCs = []
createdFCs.append(pointSelectOutput)
createdFCs.append(pointsWithLandUse)
createdFCs.append(pointsWithSlope)
createdFCs.append(pointsWithPrecip)

#field names
uniqueIDField = "UNIQUE_ID"
bufferDistanceField = "BufferDistance"
slideLengthmFieldName = 'Slide_Length_m'
rasterValueFieldName = "RASTERVALU"
lulcFieldName = "NLCD_2011"
moveClassFieldName = "MOVE_CLASS"
pointSlopeFieldName = 'Slope_Pt'
pointPrecipFieldName = 'PRECIP_mm'
bufferedSlopeFieldName = 'Slope_Buffered_Mean'
calculatedSlopeFieldName = 'Slope_Mean_Calculated'

#processing of data will begin
#selecting landslides that have width and length greater than 0 and are in the debris flow, earth flow, or flow mvmt classes
print "Selecting landslides" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Select_analysis(landslidePts,pointSelectOutput, "LENGTH_ft <>0 AND WIDTH_ft <>0 AND MOVE_CLASS IN ('Debris Flow', 'Earth Flow', 'Flow')")

#add land use class info to the landslide points only for their specific location
ExtractValuesToPoints(pointSelectOutput,landUseData,pointsWithLandUse,"NONE","ALL")
arcpy.AlterField_management(pointsWithLandUse,rasterValueFieldName,"LULC_Value", "LULC Value")
#add slope and precipitation values to the landslide points using bilinear interpolation
ExtractMultiValuesToPoints(pointsWithLandUse,[[slopeData,pointSlopeFieldName],[precipData,pointPrecipFieldName]],"BILINEAR")

#add a field for the buffer distance
arcpy.AddField_management(pointsWithLandUse,bufferDistanceField, "FLOAT", "","","","","NULLABLE", "NON_REQUIRED","")
#calculate the buffer distance in meters from the slide length and width (feet)
arcpy.CalculateField_management(pointsWithLandUse,bufferDistanceField, "(!LENGTH_ft!+!WIDTH_ft!)/2 *0.3048","PYTHON_9.3","")
#add a field for the slide length in meters
arcpy.AddField_management(pointsWithLandUse,slideLengthmFieldName, "FLOAT", "","","","","NULLABLE", "NON_REQUIRED","")
#calculate the slide length in meters from the length and width (feet)
arcpy.CalculateField_management(pointsWithLandUse,slideLengthmFieldName, "(!LENGTH_ft!+!WIDTH_ft!)*0.3048","PYTHON_9.3","")

#buffer landslides
print "Buffering landslides" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Buffer_analysis(pointsWithLandUse,pointsBuffered,bufferDistanceField,"FULL", "ROUND", "NONE", "", "PLANAR")

#use zonal statistics as a table to summarize slope within each buffered landslide
arcpy.gp.ZonalStatisticsAsTable_sa(pointsBuffered,uniqueIDField,slopeData,bufferedSlopeStatistics, "DATA", "ALL")

#join field
arcpy.JoinField_management(pointsBuffered,uniqueIDField,bufferedSlopeStatistics,uniqueIDField,"Min;Max;Range;Mean;Std")

#alter the joined field name for easier identification
arcpy.AlterField_management(pointsBuffered,"MEAN",bufferedSlopeFieldName,bufferedSlopeFieldName)

#add field for the search and update cursor
arcpy.AddField_management(pointsBuffered,calculatedSlopeFieldName, "FLOAT", "","","","", "NULLABLE", "NON_REQUIRED","")


#create and use an update cursor to set the slope value for all that are not null, and the point value for features that have a null buffered value
print "Setting values based on buffered and point slope" ,datetime.datetime.now().strftime("%H:%M:%S")
listOfFields = [pointSlopeFieldName,bufferedSlopeFieldName,calculatedSlopeFieldName]

with arcpy.da.UpdateCursor(pointsBuffered,listOfFields) as cursor:
    for row in cursor:
        if(row[1]==None):
            #if the buffered slope value [1] is null (None), set the value for the calculated slope [2] to the point value [0]
            print "Setting value to the point's slope value" ,datetime.datetime.now().strftime("%H:%M:%S")
            row[2]=row[0]
            cursor.updateRow(row)
        else:
                #if the buffered slope value [1] is valid, set the value for the calculated slope [2] to the buffered value [1]
                print "Setting value to the buffer's slope value" ,datetime.datetime.now().strftime("%H:%M:%S")
                row[2]=row[1]
                cursor.updateRow(row)
del cursor


#calculate summary stats for precip, slope, and slide area based on mvmt class
print "Calculating summary stats for precip, slope and slide area based on mvmt class" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Statistics_analysis(pointsBuffered,bufferedSlideTypeSummary,[[pointPrecipFieldName,"MEAN"],[pointSlopeFieldName,"MEAN"],[bufferDistanceField,"MEAN"],[slideLengthmFieldName,"MEAN"]],moveClassFieldName)

#calculate summary stats for precip, slope and slide area based on both LULC and mvmt class
print "Calculating summary stats for precip, slope and slide area based on LULC and mvmt class" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Statistics_analysis(pointsBuffered,bufferedBothSummary,[[pointPrecipFieldName,"MEAN"],[pointSlopeFieldName,"MEAN"],[slideLengthmFieldName,"MEAN"]],[lulcFieldName,moveClassFieldName])

#tabulate area
print "Tabulating the area of each LULC class for each buffered point" ,datetime.datetime.now().strftime("%H:%M:%S")
TabulateArea(pointsBuffered,uniqueIDField,landUseData,lulcFieldName,bufferedTabulateLULC,15)

#export table to excel for additional analysis
arcpy.TableToExcel_conversion(bufferedTabulateLULC,tabulateLULC_ExcelTable)

#delete intermediate feature classes
for fc in createdFCs:
    if arcpy.Exists(fc):
        print fc + " exists and will now be deleted"
        arcpy.Delete_management(fc)

print "The script is complete."

Monday, July 18, 2016

Exercise 5: Preparing Raster Data for Analysis Using Lists & Loops

Goals and Objectives

The purpose of this exercise was to gain experience with standard raster geoprocessing and raster analysis tools.  These included the Project Raster, Clip, Hillshade, and Slope tools, along with creating a list of rasters and processing it in a FOR IN loop.  


Methods

In this exercise, raster data about the state of Oregon was prepared for analysis with the help of Python.  While developing this script, comments were utilized to annotate each section of code.  The following steps were taken: 

  • importing system modules
  • importing Python modules: os, time, datetime, shutil, ArcGIS extensions
  • creating variables
  • creating a list of rasters
  • setting up a FOR IN loop
  • formatting raster names
  • projecting and clipping a list of rasters
  • creating a hillshade and calculating slope for a list of rasters
  • merging the tiles together
  • debugging and running script


Results 

The final script is displayed below.  After successfully running the script, the results were viewed in ArcMap. 

#-------------------------------------------------------------------
# Name:         Ex5
# Purpose:      Using standard raster geoprocessing tools in a FOR IN loop
#
# Author:      williaan
#
# Created:     18/07/2016

#-------------------------------------------------------------------


#import system modules
import arcpy
from arcpy import env
import os
import time
import datetime
import shutil
from arcpy.sa import *
from arcpy.ddd import *
arcpy.env.overwriteOutput = True

arcpy.env.workspace = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\oregon"

#setting output coordinate system to NAD_1983_HARN_Oregon_Statewide_Lambert
arcpy.env.outputCoordinateSystem = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\OregonBoundary"

#check out Spatial Analyst and 3D Extensions

arcpy.CheckOutExtension('3D')
arcpy.CheckOutExtension('spatial')


#create variables

print "Obtaining a list of rasters" ,datetime.datetime.now().strftime("%H:%M:%S")

workspace = r"Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\oregon"
geodatabasePath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb"


#set boundary for clipping all rasters (Oregon State boundary)

clipBoundary = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_5\Ex5\Ex5.gdb\OregonBoundary"

#list for all clipped rasters (for future mosaicing)
clipList = []

#list for hillshades
hsList = []

#list for slopes

slopeList = []

#names for mosaic operations

mosaicedDEM = "NW_Oregon_Elevation_30m"
mosaicedHS = "NW_Oregon_Hillshade_30m"
mosaicedSlope = "NW_Oregon_Slope_30m"

print "Obtaining a list of rasters"
rasters = arcpy.ListRasters("","IMG")

for raster in rasters:
    print "Formatting the raster's name"
    #obtaining just basename of raster (no file extension or path)
    outName = os.path.splitext(os.path.basename(raster))[0]
    print "Unformatted name: " + outName
    #using string 'slice' to remove first three characters from filename
    fileName = str(outName[3:])
    print "Formatted name: " + fileName
    #set all new raster names
    rasterProj = os.path.join(geodatabasePath,fileName) + "_Proj"
    rasterClip = os.path.join(geodatabasePath,fileName) + "_CLip"
    rasterHS = os.path.join(geodatabasePath,fileName) + "_HS"
    rasterSlope = os.path.join(geodatabasePath,fileName) + "_Slope"
    #print rasterProj to make sure name formatting is working
    print rasterProj
    #project raster
    print "Projecting" + fileName ,datetime.datetime.now().strftime("%H:%M:%S")
    arcpy.ProjectRaster_management(raster,rasterProj,"","BILINEAR")
    #clip projected raster to state boundary
    print "Clipping" + fileName
   arcpy.Clip_management(rasterProj,"",rasterClip,clipBoundary,"","ClippingGeometry","NO_MAINTAIN_EXTENT")
    clipList.append(rasterClip)
    #run hillshade on clipped rasters
    print "Hillshading" + fileName
    arcpy.HillShade_3d(rasterClip,rasterHS)
    hsList.append(rasterHS)
    #calculate slope on list of clipped rasters
    print "Calculating slope of" + fileName
    arcpy.Slope_3d(rasterClip,rasterSlope)
    slopeList.append(rasterSlope)
    print "Loop is finished." ,datetime.datetime.now().strftime("%H:%M:%S")

#merging the tiles
print "Mosaicing the clipped DEMS"
arcpy.MosaicToNewRaster_management(clipList,geodatabasePath,mosaicedDEM,"","32_BIT_FLOAT","",1)
print "Mosaicing the hillshaded DEMS"
arcpy.MosaicToNewRaster_management(hsList,geodatabasePath,mosaicedHS,"","8_BIT_UNSIGNED","",1)
print "Mosaicing the slope rasters"
arcpy.MosaicToNewRaster_management(slopeList,geodatabasePath,mosaicedSlope,"","32_BIT_FLOAT","",1)


print "Script is complete." ,datetime.datetime.now().strftime("%H:%M:%S")

Exercise 4: Working with Fields & Selections

Goals and Objectives

The purpose of this exercise was to gain experience working with adding and calculating fields as well as using SQL statements within PyScripter. 


Methods 

This exercise continued the Exercise 1 workflow begun in Exercise 3, again using Python instead of ModelBuilder.  While developing the script, comments were utilized to annotate each section of code.  The following steps were taken: 

  • importing system modules
  • importing Python modules: os, time, datetime
  • creating variables
  • processing data: adding fields, calculating fields, selecting features
  • debugging and running script


Results

The final script is displayed below. After successfully running this script, the results were viewed in ArcMap.

#-------------------------------------------------------------------
# Name:         Ex4
# Purpose:      Add field, calculate field, and apply SQL statement via Python
#
# Author:      williaan
#
# Created:     18/07/2016

#-------------------------------------------------------------------


#import system modules
import arcpy
from arcpy import env
import os
import time
import datetime
arcpy.env.overwriteOutput = True

#create variables
print "Creating Variables" ,datetime.datetime.now().strftime("%H:%M:%S")

#input variables
ex3geodatabasePath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_3\Ex3_results.gdb"

#name of the dissolved feature class created in Exercise 3
intersectName = "IntersectedFcs"
dissolveOutput = os.path.join(ex3geodatabasePath,intersectName) + "_Dissolved"

#output variables
selectName = "DissolveFC_Selected"
finalSelect = os.path.join(ex3geodatabasePath,selectName)


#process: add field
print "Adding a field to the dissolved fcs" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.AddField_management(dissolveOutput,"Area_km2", "FLOAT", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")

#process: calculate field
print "Calculating the area in km2" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.CalculateField_management(dissolveOutput, "Area_km2", "[Shape_Area] / 1000000", "VB", "")

#process: select areas greater than 2 square kilometers
print "Selecting only polygons with areas greater than 2 square kilometers" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Select_analysis(dissolveOutput, finalSelect, "Area_km2 > 2")

#process: add field
print "Adding a field for calculating compactness" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.AddField_management(finalSelect, "Compactness_Float", "FLOAT", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")

#process: calculate field
print "Calculating compactness" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.CalculateField_management(finalSelect, "Compactness_Float", "[Shape_Area] / ([Shape_Length] * [Shape_Length])", "VB", "")

print "Calculations are complete." ,datetime.datetime.now().strftime("%H:%M:%S")



Friday, July 15, 2016

Exercise 3: Geoprocessing with Python

Goals and Objectives

The purpose of this exercise was to provide an introduction to geoprocessing by way of recreating the Exercise 1 results using PyScripter.  This involved exporting a geoprocessing model as a script, adding several smart variables, and modifying the exported script. 


Methods

This exercise carried out the exact same processes as those used in Exercise 1, only this time using Python instead of ModelBuilder.  While developing the script, comments were utilized to annotate each section of code.  The following steps were taken: 

  • importing system modules
  • importing Python modules: os, time, datetime
  • creating variables
  • inputting feature class names
  • linking geodatabase path with file names
  • processing data: clipping, selecting, buffering, intersecting, dissolving*
  • debugging and running script

*To expedite carrying out these processes, the model from Exercise 1 was exported to a Python script and portions of that code were copied and updated.  

Results

The final script is displayed below.  After successfully running this script, the results were viewed in ArcMap. 

#-------------------------------------------------------------------
# Name:         Ex3
# Purpose:      Finds ideal areas for ski resorts
#
# Author:      williaan
#
# Created:     15/07/2016

#-------------------------------------------------------------------

#import system modules
import arcpy
from arcpy import env
import os
import time
import datetime
arcpy.env.overwriteOutput = True

#create variables
print "Creating Variables" ,datetime.datetime.now().strftime("%H:%M:%S")

#input geodatabase
ex1geodatabasePath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_1\Ex1.gdb"

#input names
nfsLand = "NFS_Land"
snowDepth = "SnowDepth_in"
temp = "Temperature_F"
studyArea = "StudyArea"
airports = "Airports"

#linking geodatabase with name
nfsInput = os.path.join(ex1geodatabasePath,nfsLand)
snowDepthInput = os.path.join(ex1geodatabasePath,snowDepth)
tempInput = os.path.join(ex1geodatabasePath,temp)
studyAreaInput = os.path.join(ex1geodatabasePath,studyArea)
airportsInput = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_1\Ex1.gdb\Airports_Project"

#output variables
ex3geodatabasePath = "Q:\StudentCoursework\CHupy\GEOG.491.801.2167\WILLIAAN\Ex_3\Ex3_results.gdb"

#clipped
nfsLandClip = os.path.join(ex3geodatabasePath,nfsLand) + "_Clip"
snowDepthClip = os.path.join(ex3geodatabasePath,snowDepth) + "_Clip"
tempClip = os.path.join(ex3geodatabasePath,temp) + "_Clip"
airportsClip = os.path.join(ex3geodatabasePath,airports) + "_Clip"

#selected
snowDepthSelected = os.path.join(ex3geodatabasePath,snowDepth) + "_Selected"
tempSelected = os.path.join(ex3geodatabasePath,temp) + "_Selected"
airportsSelected = os.path.join(ex3geodatabasePath,airports) + "_Selected"

#buffer output
airportsBuffered = os.path.join(ex3geodatabasePath,airports) + "_Buffered"

#intersect output
intersectName = "IntersectedFcs"
intersectOutput = os.path.join(ex3geodatabasePath,intersectName)

#dissolve output
dissolveOutput = os.path.join(ex3geodatabasePath,intersectName) + "_Dissolved"

#final select
finalSelect = os.path.join(ex3geodatabasePath,intersectName) + "_MoreThan2km2"

#begin processing 
print "Starting to Process" ,datetime.datetime.now().strftime("%H:%M:%S")


#clip all feature classes to the study area
print "Clipping fcs to within the study area" ,datetime.datetime.now().strftime("%H:%M:%S")
#clip nfs land
arcpy.Clip_analysis(nfsInput, studyAreaInput, nfsLandClip, "")

#clip temp
arcpy.Clip_analysis(tempInput, studyAreaInput, tempClip, "")

#clip snow depth
arcpy.Clip_analysis(snowDepthInput, studyAreaInput, snowDepthClip, "")

#clip airports
arcpy.Clip_analysis(airportsInput, studyAreaInput, airportsClip, "")


#select meaningful values from clipped classes
print "Selecting meaningful values from clipped classes" ,datetime.datetime.now().strftime("%H:%M:%S")
#select temp
arcpy.Select_analysis(tempClip, tempSelected, "gridcode<32")

#select snow depth
arcpy.Select_analysis(snowDepthClip, snowDepthSelected, "gridcode>240")

#select airports
arcpy.Select_analysis(airportsClip, airportsSelected, "OwnerType = 'Pu' AND hasTower = 'Y'")


#buffering selected airports
print "Buffering selected airports" ,datetime.datetime.now().strftime("%H:%M:%S")
#buffer
arcpy.Buffer_analysis(airportsSelected, airportsBuffered, "40 Miles", "FULL", "ROUND", "ALL", "", "PLANAR")


#intersecting the fcs
print "Intersecting the fcs" ,datetime.datetime.now().strftime("%H:%M:%S")
#intersect
arcpy.Intersect_analysis([snowDepthSelected,tempSelected,nfsLandClip,airportsBuffered], intersectOutput, "ALL", "", "INPUT")


#dissolving the intersected fcs
print "Dissolving the intersected fcs" ,datetime.datetime.now().strftime("%H:%M:%S")
arcpy.Dissolve_management(intersectOutput, dissolveOutput, "", "", "SINGLE_PART", "DISSOLVE_LINES")

print "Script is complete."