7 Dec 2014

QspatiaLite Quicktip: Convert MULTILINESTRING to LINESTRING

One often encounters the problem that, after digitizing or running processing algorithms, the output geometry is MULTILINESTRING although we would rather have the geometry type LINESTRING. Until now I used a quite cumbersome, multi-step workflow for converting between these geometry types - however, as we will see, all of this becomes ridiculously easy with spatial SQL:

select 
   replace(replace(replace(replace(replace(replace(astext(Collect(t.geometry)), 'MULTILINESTRING((','§'), '))', '%'), '(', ''), ')', ''), '§', 'LINESTRING('), '%', ')'
) as geom
from (
    select MultiLinestringFromText('MULTILINESTRING((-1 -1, 0 0), (1 1, 4 4))') as geometry
) as t

resulting in:
LINESTRING(-1 -1, 0 0, 1 1, 4 4)

However, if your original line was something like MULTILINESTRING((-1 -1, 0 0), (0 0, 4 4))
you'd end up with:

LINESTRING(-1 -1, 0 0, 0 0, 4 4)

which contains duplicate vertices - something we certainly don't want!

So be aware that the ordering / direction of the linestring will be the same as in the segments of the original layer! And, as we saw, gaps between subsequent end-/start-nodes will be closed in the new geometry! It is advisable to double-check before / after conversion - e.g. with the quick length comparison sketched below.
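
A check I find useful (a minimal sketch reusing the toy geometries from above - for a real layer you would compare the original geometry column with the converted one): compare the total lengths before and after conversion; if the converted LINESTRING is longer, bridging segments were inserted where the original parts did not touch.

select
    GLength(MultiLinestringFromText('MULTILINESTRING((-1 -1, 0 0), (1 1, 4 4))')) as len_orig, -- ~5.66
    GLength(GeomFromText('LINESTRING(-1 -1, 0 0, 1 1, 4 4)')) as len_conv                      -- ~7.07, the gap (0 0)-(1 1) was bridged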

If you deal with a multilinestring (or a combination of any type of linestrings) whose parts share end-/start-nodes, things are even easier - just use this SQL:

SELECT AsText(Linemerge(MultiLinestringFromText('MULTILINESTRING((-1 -1, 0 0), (0 0, 4 4))')))

resulting in:
LINESTRING(-1 -1, 0 0, 4 4)
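
Mind that Linemerge only joins parts that actually share end-/start-nodes. A quick way to verify that the result really is a single LINESTRING (a minimal sketch with the toy geometry from above) is to inspect its geometry type - parts that could not be merged would still show up as MULTILINESTRING:

SELECT GeometryType(Linemerge(MultiLinestringFromText('MULTILINESTRING((-1 -1, 0 0), (0 0, 4 4))')))
-- returns LINESTRING here; MULTILINESTRING would indicate unmerged parts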

6 Dec 2014

QspatiaLite Use Case: Query for Species Richness within Search-Radius

Following up on my previous blogpost on using SpatiaLite for the calculation of diversity metrics from spatial data, I'll add this SQL-query, which counts unique species names from the intersection of species polygons with a circular buffer around the centroids of an input grid. The species count within the buffer area is joined to a newly created grid. I use a subquery which keeps only those cells of the rectangular input grid for which the buffer around the grid-cell's centroid is covered by the unioned species polygons to at least 80% (i.e. the intersection area must exceed pow(500, 2)*pi()*0.8, which is 80% of the area of a circle with radius 500).



  • Example data is HERE. You can use the shipped qml-stylefile for the newly generated grid. It labels three grid-cells with the species counts for illustration.

  • Import grid- and Sp_distr-layers with QspatiaLite Plugin

  • Run the query and choose the option "Create spatial table and load in QGIS"; mind to set "geom" as the geometry column

    select 
        g1.PKUID as gID,
        count (distinct s.species) as sp_num_inbu, 
        g1.Geometry AS geom
    from (
     select g.*
     from(select Gunion(geometry) as geom
               from Sp_distr) as u, grid as g
     where area(intersection(buffer(centroid(g.geometry), 500), u.geom)) > pow(500, 2)*pi()*0.8
    ) as g1 join Sp_distr as s on intersects(buffer(centroid( g1.Geometry), 500), s.Geometry)
    group by gID
    

    5 Dec 2014

    QspatiaLite Use Case: Get Subselection of Grid which Covers Polygon

    Here's another short SQL-query which I used to get a subselection from a rectangular grid. The aim is to keep only the grid-cells that are fully covered by the area of a second polygon-layer - cells which do not lie completely within the polygon's area will be dropped from the new grid-layer.

    select 
      g.*
    from(select Gunion(geometry) as geom
               from MYPLGN) as u, grid as g
    where area(intersection(g.geometry, u.geom)) = area(g.geometry)
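
    If full coverage is too strict for your use case, the same pattern can be relaxed with a coverage threshold - a minimal sketch, where 0.5 (i.e. at least 50% of the cell covered) is just an arbitrary example value:

    select 
      g.*
    from(select Gunion(geometry) as geom
               from MYPLGN) as u, grid as g
    where area(intersection(g.geometry, u.geom)) >= area(g.geometry) * 0.5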
    

    2 Dec 2014

    QspatiaLite Use Case: Connect Points with Same ID with Line Using the QspatiaLite Plugin

    Another short example illustrating the effectiveness of geoprocessing with SpatiaLite, using the great QGIS-plugin QspatiaLite.

  • We have a point-layer ("Birds") with an ID column, each ID occurring twice and representing an individual. The two points sharing an ID should serve as start- and end-nodes of the connecting line. Note that this would also apply if there were more than two points - then the same query could be used to connect all of a bird individual's points into a line by their order within each group (see the ordering note at the end of this list)!

  • We want each set of points, grouped by ID, to be connected. This is easily achieved by importing the points to a SpatiaLite-DB with the QspatiaLite plugin and running a very simple query:

    SELECT 
        ID,
        makeline(Geometry) AS geom
    FROM Birds
    GROUP BY ID
    

  • Load the result to QGIS and that's it!
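
  • A note on vertex order: the MakeLine aggregate connects the points in the order the rows are read. If that order matters (e.g. for GPS fixes per individual), you can in practice enforce it with a sorted subquery - a minimal sketch, assuming a timestamp column "obs_time" which is not part of the example data:

    SELECT 
        ID,
        makeline(Geometry) AS geom
    FROM (SELECT ID, Geometry FROM Birds ORDER BY ID, obs_time) -- "obs_time" is a hypothetical attribute
    GROUP BY ID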

    1 Dec 2014

    QspatiaLite Use Case: Find Dominant Species and Species Count within Sampling Areas Using the QspatiaLite Plugin

    This blogpost shows how to find the dominant species and the species count within sampling polygons. The species layer I'll use here consists of overlapping polygons representing the distributions of several species. The Regions-layer represents areas of interest over which we would like to calculate some measures like species count, dominant species and the area occupied by the dominant species.

    Since QGIS now makes import/export and querying of spatial data easy, we can use the SpatiaLite engine to join the two layers on their intersections and then aggregate the intersection areas by applying the max- and count-functions per region. We'll also keep the identity and the area of the species with the largest intersecting area (the query relies on SQLite's behaviour that, when max() is used, the other selected columns are taken from the row holding the maximum).

    For the presented example I'll use
  • Regions, which is a polygon layer with areas of interest
  • Species, which is a polygon layer with overlapping features, representing species

    Do the calculation in 2 easy steps:
  • Import the layers to a spatiaLite DB with the Import function of the plugin (example data: HERE)
  • Run the query. For later use you can load this table to QGIS or export with the plugin's export button.

    SELECT   
      t.region AS region,
      t.species AS sp_dom,
      count(*) AS sp_number,
      max(t.sp_area) / 10000 AS sp_dom_area
      FROM ( 
          SELECT
              g.region AS region, s.species AS species,
              area(intersection(g.Geometry, s.Geometry)) AS sp_area
              FROM Regions AS g JOIN Sp_Distribution AS s 
              ON INTERSECTS(g.Geometry, s.Geometry)  
      ) AS t
    GROUP BY t.region
    ORDER BY t.region
    

    Addendum:
    If you wish to calculate any other diversity measures, like diversity or heterogeneity indices, you might just run the query below (which actually is the subquery from above) and feed the resulting table to any statistics software - a small R example follows after the query!

    The output table will contain the regions' IDs, each intersecting species and the intersection area.
    The intersection area, which is the species' area per polygon, is the metric that would be used for calculating diversity / heterogeneity measures, etc. of the regions!

    SELECT 
      g.region AS regID, 
      s.species AS sp,
      AREA(INTERSECTION(g.geometry, s.geometry)) AS sp_area
    FROM Regions AS g JOIN Sp_Distribution AS s 
    ON INTERSECTS(g.Geometry,s.Geometry)
    ORDER BY regID, sp_area ASC
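
    For instance, a Shannon index per region takes only a few lines of R - a minimal sketch, assuming the query result was exported (e.g. with the plugin's export button) to a CSV file named "sp_area_per_region.csv" (an assumed file name) with the columns regID, sp and sp_area:

    # minimal sketch; "sp_area_per_region.csv" is an assumed export of the above query
    dat <- read.csv("sp_area_per_region.csv")
    
    # Shannon index per region, using the intersection areas as abundances
    shannon <- function(a) { p <- a / sum(a); -sum(p * log(p)) }
    tapply(dat$sp_area, dat$regID, shannon)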
    


    I tested this on
  • QGIS 2.6 Brighton
  • with the QspatiaLite Plugin installed

    QspatiaLite Use Case: SpatiaLite Aggregation over Points within Polygons Using the QspatiaLite Plugin

    Here's a nice example for aggregating points over polygon areas, which I grabbed from an answer on SO by user @Micha. The polygons could be regions of interest, a sampling grid, etc.
    Say you want to compute the maximum, minimum, average, etc. per polygon using the spatial database SpatiaLite.


  • You'll first need to import both of your layers into a SpatiaLite DB - here the point layer is called "sensors" and has a "pollution" column, and the polygon layer is called "SHAPE1" and has a "plgnID" column. You can do this easily with the QspatiaLite plugin's "Import" button (example data is HERE).

  • Now this query will give you various statistics from the sensors for each polygon:

    SELECT g.plgnID AS "plgn_ID",
       AVG(s.pollution) AS "Average Pollution", 
       MAX(s.pollution) AS "Maximum Pollution",
       COUNT(*) AS "Number of Sensors"
    FROM sensors AS s JOIN SHAPE1 AS g 
    ON contains(g.geometry, s.geometry)
    GROUP BY g.plgnID
    

    29 Nov 2014

    QspatiaLite Use Case: SpatiaLite Aggregation over Intersections of Polygons with QspatiaLite Plugin

    This applies to several use cases: imagine you have a grid or polygon-layer of sampling areas and want to know the dominant feature of another polygon layer under each grid cell / sampling polygon - this could be soil types, landuse classes, etc. Besides the dominant feature, you might be interested in the diversity of features (e.g. the number of soil types) per grid cell / sampling area.

    QGIS alone does not provide handy tools for aggregating features of one layer combined with other layers, but the SpatiaLite engine is tailored for this! Since QGIS now makes import/export and querying of spatial data easy, it is well worth diving into SpatiaLite and utilizing its powerful tools!


    For the presented example I'll use:
  • SHAPE1, which is a polygon layer with a sampling grid/areas
  • Soils, which is a polygon layer with soiltypes

    I tested this on
  • QGIS 2.6 Brighton
  • with the QspatiaLite Plugin installed


  • Import the above layers to a spatiaLite DB with the Import function of the plugin (example data: HERE)


  • Run the query, choose "Create spatial table and load in QGIS" and set "geom" as the geometry column! (I chose SHAPE2 as the name for the newly created layer.)


    SELECT t.geom AS geom, 
        t.plgnID AS plgnID, 
        t.soiltype AS soiltype, 
        max(t.soil_area) AS MaxArea, count(*) AS n_soiltypes
           FROM (SELECT 
              g.Geometry AS geom, g.plgnID AS plgnID, s.Soiltype AS soiltype,
              AREA(INTERSECTION(g.geometry, s.geometry)) AS soil_area
              FROM SHAPE1 AS g JOIN Soils AS s 
              ON INTERSECTS(g.Geometry,s.Geometry)
           ) AS t
    GROUP BY t.plgnID
    ORDER BY t.plgnID
    


  • That's it!

    4 Nov 2014

    VBA Spreadsheet Function for Substring Inbetween Strings

    Function Substring2(theString As String, str1 As String, repstr1 As Integer, Optional str2 As Variant, Optional repStr2 As Variant) As String
    
        '****************************************************************************************************************
        'author:    kay cichini
        'date:      04112014
        'purpose:   find the substring delimited by the n-th repetition of one string on the left side
        '           and the n-th repetition of another string on the right side
        'str1:      first string to be matched
        'str2:      second string to be matched, optional
        'repstr1:   n-th repetition of str1 to be matched
        'repstr2:   n-th repetition of str2 to be matched, optional
        '           with the optional arguments omitted the function will return the substring ending with the
        '           last character of the search string
        '----------------------------------------------------------------------------------------------------------------
        'example:   Substring2("1234 678 101214 xxxx"; " "; 2; "x"; 3)
        '           will match position 10 after the second repetition of str1, find position 20 after the third "x",
        '           then apply a Mid-function with signature 'Mid(string, start, length)',
        '           where position 10 is the start and the length is position 20 - Len("x") - 10 = 9,
        '           and the result is "101214 xx"
        '****************************************************************************************************************
        
        Dim start1, start2, lenStr1, lenStr2, length As Integer
        
        If IsMissing(str2) And IsMissing(repStr2) Then
        
            'case when last char in string should be matched
            '-----------------------------------------------
            
            start1 = 1
            lenStr1 = Len(str1)
            
            If InStr(start1, theString, str1) = 0 Then
                '0 -> String couldn't be matched!
                Exit Function
            End If
            
            For i = 0 To repstr1 - 1
                start1 = InStr(start1, theString, str1) + lenStr1
            Next i
            
            length = Len(theString) - start1 + 1
            Substring2 = Mid(theString, start1, length)
    
        Else
        
            'other cases
            '-----------
            start1 = 1
            lenStr1 = Len(str1)
            start2 = 1
            lenStr2 = Len(str2)
            
            If InStr(start1, theString, str1) = 0 Or InStr(start2, theString, str2) = 0 Then
                '0 -> String couldn't be matched!
                Exit Function
            End If
            
            For i = 0 To repstr1 - 1
                start1 = InStr(start1, theString, str1) + lenStr1
            Next i
            
            For i = 0 To repStr2 - 1
                start2 = InStr(start2, theString, str2) + lenStr2
            Next i
    
            length = start2 - lenStr2 - start1
            Substring2 = Mid(theString, start1, length)
            
        End If
        
    End Function
    

    26 Sept 2014

    Make a KML-File from an OpenStreetMap Trail

    Ever wished to use a trail from OSM on your GPS or smartphone? With this neat little R-script this can easily be done. You'll just need to search OpenStreetMap for the ID of the trail (way), pass it as an argument to osmar::get_osm, convert the result to KML and you're good to go!




    # get OSM data
    library(osmar)
    library(maptools)
      
    rotewandsteig <- get_osm(way(166274005), full = T)
    sp_rotewandsteig <- as_sp(rotewandsteig, what = "lines")
      
    # convert to KML 
    kmlLine(sp_rotewandsteig@lines[[1]], kmlfile = "rotewandsteig.kml",
            lwd = 3, col = "blue", name = "Rotewandsteig") 
    
    # view it
    shell.exec("rotewandsteig.kml")
    

    14 Jul 2014

    Custom Feature Edit Forms in QGIS with Auto-Filling Using PyQt

    For anyone interested in the capabilities of customized feature edit forms in QGIS I'd like to reference the following GIS-Stackexchange posting: http://gis.stackexchange.com/questions/107090/auto-populate-field-as-concatenation-of-other-fields

    Quick GIS-Tip: Permanently Sort Attribute Table by Attribute

    Here's a short shell-script (I have C:\OSGeo4W64\bin\ on my PATH) for sorting GIS data - a SQLite DB in this case - and saving it in the new order to a file. The DB is sorted in ascending order by the attribute 'Name' and written to a file with the KML driver.

    cd C:/Users/Kay/Documents/Web/Openlayers/Docs/Digitizing
    ogr2ogr -sql "SELECT * FROM 'trails-db' ORDER BY Name ASC" -f "KML" trails.kml trails.sqlite
    

    4 May 2014

    R GIS: Function to Reverse KML Paths

    This is a function I wrote up for reversing KML-paths. The paths within a KML can be partially matched by their name-tags.

    ## name:          ReverseKmlPath   
    ## use:           Reverse KML-paths by matching their Name tags
    ## arguments:     PATH_TO_DOC, the path to the KML-file
    ##                NAMES, the values of the name tags; the function uses partial matching!
    ##                'Trail_xyz' will be matched by 'rail'
    ## requirements:  KML-structure with Placemarks containing a name and a coordinates tag
    ## author:        Kay Cichini
    ## date:          01-05-2014
    ## license:       CC-BY-NC-SA
    
    ReverseKmlPath <- function(PATH_TO_DOC, NAMES) {
        
        require(XML)
    
        doc <- xmlInternalTreeParse(PATH_TO_DOC)
        
        if (xmlNamespaceDefinitions(doc)[[1]]$uri == "http://www.opengis.net/kml/2.2") {
            namespaces <- c(kml = "http://www.opengis.net/kml/2.2")
            flag <- 1
        } else {
            if (xmlNamespaceDefinitions(doc)[[1]]$uri == "http://earth.google.com/kml/2.0") { 
                    namespaces <- c(kml0 = "http://earth.google.com/kml/2.0")
                    flag <- 0
                } else {
                    stop ("Stopped!: Check namespace issue..")
                }
        }
            
        
        for (NAME in NAMES) {
            
            if (flag) { 
                  query <- paste0("//kml:Placemark[contains(kml:name,'", sprintf("%s", NAME), "'", ")]//kml:coordinates")
              } else {
                  query <- paste0("//kml0:Placemark[contains(kml0:name,'", sprintf("%s", NAME), "'", ")]//kml0:coordinates")
              }
    
            coords <- tryCatch(getNodeSet(doc, query, namespaces), 
                               error = function(e) message(paste("\nError: *", NAME, "* was NOT successfully matched\n")))
            
            for (i in seq_along(coords)) {
    
                #grab coordinates from node and reverse order
                rev_coord_vector <- rev(unlist(strsplit(gsub("\\t|\\n", "", xmlValue(coords[[i]])), "\\s")))
                rev_coord_string <- paste(rev_coord_vector, collapse = " ")
    
                # re-insert reversed line-string:
                xmlValue(coords[[i]]) <- rev_coord_string
    
                # message
                if (flag) { 
                      query <- paste0("//kml:Placemark[contains(kml:name,'", sprintf("%s", NAME), "'", ")]//kml:name")
                  } else {
                      query <- paste0("//kml0:Placemark[contains(kml0:name,'", sprintf("%s", NAME), "'", ")]//kml0:name")
                }
                match <- xmlValue(getNodeSet(doc, query, namespaces)[[i]])
                message(paste0("matched name: ", match, "\n..."))
    
            }
        }
    
        # save:
        message("Reversed paths saved to:")
        saveXML(doc, paste0(dirname(PATH_TO_DOC), "/reversed_", basename(PATH_TO_DOC)),
                prefix = newXMLCommentNode("This file was created with the R-package XML::saveXML, see: "))
    }
    
    ## not run: 
    tf <- tempfile(fileext = ".kml")
    download.file("http://dev.openlayers.org/releases/OpenLayers-2.13.1/examples/kml/lines.kml", tf, mode = "wb")
    ReverseKmlPath( PATH_TO_DOC = tf, NAMES = c("Absolute", "Relative") )
    
    shell.exec(tf)
    shell.exec(paste0(dirname(tf), "/reversed_", basename(tf)))
    

    3 May 2014

    R GIS: Generalizer for KML Paths

    I'm posting a recent project's spin-off: a custom line-generalizer which I used for huge KML-paths. Anyone with a less clumsy approach?

    ## line generalizing function: takes two vectors with x/y coords
    ## and returns a logical vector flagging the elements whose distance to the
    ## previous element is larger than the average distance between consecutive
    ## vertices multiplied by 'fac' (the first and last vertices are always kept)
    check_dist <- function(x, y, fac) {
        dm <- as.matrix(dist(cbind(x, y)))
        
        ## supradiagonal holds distance from 1st to 2nd, 2nd to 3rd, etc. element
        d <- diag(dm[-1, -ncol(dm)])
        mean_dist <- mean(d)
        keep <- logical()
        
        ## always keep the first..
        keep[1] <- T
        for (i in 1:(length(x) - 2)) {
            keep[i + 1] <- (d[i] > mean_dist * fac)
            message(paste0("Distance from item ", i, " to item ", i + 1, " is: ", d[i]))
        }
        message(paste0("Treshold is: ", mean_dist * fac))
        cat("--\n")
        ## .. and always keep last
        keep[length(x)] <- T
        return(keep)
    }
    
    ## Testing function check_dist:
    x <- rnorm(5)
    y <- rnorm(5)
    (keep <- check_dist(x, y, 1.2))
    
    plot(x, y)
    lines(x[keep], y[keep], lwd = 4, col = "green")
    lines(x, y, lwd = 1, col = "red")
    text(x, y + 0.1, labels = c(1:length(x)))
    
    
    ## exclude vertices by the generalization rule: segments with fewer than 'min_for_gen'
    ## vertices will not be simplified; in any case coordinates will be rounded to the
    ## 5th decimal place
    
    generalize_kml_contour_node <- function(node, min_for_gen, fac) {
        
        require(XML)
        
        LineString <- xmlValue(node, trim = T)
        
        LineStrSplit <- strsplit(unlist(strsplit(LineString, "\\s")), ",")
        
        # filter out empty LineStrings which result from strsplit on '\\s'
        LineStrSplit <- LineStrSplit[sapply(LineStrSplit, length) > 0]
        
        # all 3 values are required; in case of an error check for missing z-values:
        x <- round(as.numeric(sapply(LineStrSplit, "[[", 1, simplify = T)), 5)
        y <- round(as.numeric(sapply(LineStrSplit, "[[", 2, simplify = T)), 5)
        z <- round(as.numeric(sapply(LineStrSplit, "[[", 3, simplify = T)), 5)
        
        # for lines longer than 'min_for_gen' vertices, generalize LineStrings
        if (length(x) >= min_for_gen) {
            keep <- check_dist(x, y, fac)
            x <- x[keep]
            y <- y[keep]
            z <- z[keep]
            xmlValue(node) <- paste(paste(x, y, z, sep = ","), collapse = " ")
            
            # for all other cases, insert rounded values
        } else {
            xmlValue(node) <- paste(paste(x, y, z, sep = ","), collapse = " ")
        }
    }
    
    ## mind to use the appropriate namespace definition; alternatively use:
    ## c(kml = 'http://www.opengis.net/kml/2.2')
    kml_generalize <- function(kml_file, min_for_gen, fac) {
        doc <- xmlInternalTreeParse(kml_file)
        nodes <- getNodeSet(doc, "//kml:LineString//kml:coordinates", c(kml = "http://earth.google.com/kml/2.0"))
        mapply(generalize_kml_contour_node, nodes, min_for_gen, fac)
        saveXML(doc, paste0(dirname(kml_file), "/simpl_", basename(kml_file)))
    }
    
    ## get KML-files and generalize them
    kml_file <- tempfile(fileext = ".kml")
    download.file("http://dev.openlayers.org/releases/OpenLayers-2.13.1/examples/kml/lines.kml", 
                  kml_file, mode = "wb")
    kml_generalize(kml_file, 5, 0.9)
    shell.exec(kml_file)
    shell.exec(paste0(dirname(kml_file), "/simpl_", basename(kml_file)))

    17 Mar 2014

    Download all Documents from Google Drive with R

    A commentator on my blog recently asked if it is possible to retrieve all direct links to your Google Documents. And indeed, it can be done very easily with R, just like so:

    # you'll need RGoogleDocs (with RCurl dependency..)
    install.packages("RGoogleDocs", repos = "http://www.omegahat.org/R", type="source")
    library(RGoogleDocs)
    
    
    
    gpasswd = "mysecretpassword"
    auth = getGoogleAuth("kay.cichini@gmail.com", gpasswd)
    con = getGoogleDocsConnection(auth)
    
    CAINFO = paste(system.file(package="RCurl"), "/CurlSSL/ca-bundle.crt", sep = "")
    docs <- getDocs(con, cainfo = CAINFO)
    
    # get file references
    hrefs <- lapply(docs, function(x) return(x@access["href"]))
    keys <- sub(".*/full/.*%3A(.*)", "\\1", hrefs)
    types <- sub(".*/full/(.*)%3A.*", "\\1", hrefs)
    
    # make urls (for url-scheme see: http://techathlon.com/download-shared-files-google-drive/)
    # put format parameter for other output formats!
    pdf_urls <- paste0("https://docs.google.com/uc?export=download&id=", keys)
    doc_urls <- paste0("https://docs.google.com/document/d/", keys, "/export?format=", "txt")
    
    # download documents with your browser
    gdoc_ids <- grep("document", types)
    lapply(gdoc_ids, function(x) shell.exec(doc_urls[x]))
    
    pdf_ids <- grep("pdf", types, ignore.case = T)
    lapply(pdf_ids, function(x) shell.exec(pdf_urls[x]))
    

    3 Mar 2014

    Use Case: Make Contour Lines for Google Earth with Spatial R

    Here comes a script I wrote for creating contour lines in KML-format to be used with Google Earth: http://github.com/gimoya/theBioBucket-Archives/blob/master/R/contours_for_google_earth.R

    If you want to check or just use the datasets I created for the Alps region, you can download them here: http://terrain-overlays.blogspot.co.at/index.html

    1 Mar 2014

    Use GDAL from R Console to Split Raster into Tiles

    When working with raster datasets I often encounter performance issues caused by large file sizes. I thus wrote up a little R function that invokes gdal_translate to split a raster into parts, which makes subsequent processing more CPU-friendly. I didn't use built-in R functions simply because performance is much better when using GDAL from the command line.

    The screenshot to the left shows a raster in QGIS that was split into four parts with the below script.



    ## get filenames (assuming the datasets were downloaded already. 
    ## please see http://thebiobucket.blogspot.co.at/2013/06/use-r-to-bulk-download-digital.html 
    ## on how to download high-resolution DEMs)
    setwd("D:/GIS_DataBase/DEM")
    files <- dir(pattern = ".hgt")
    
    ## function for single-file processing; mind to replace the PATH to gdalinfo.exe!
    ## s = division applied to each side of raster, i.e. s = 2 gives 4 tiles, 3 gives 9, etc.
    split_raster <- function(file, s = 2) {
        
        filename <- gsub(".hgt", "", file)
        gdalinfo_str <- paste0("\"C:/OSGeo4W64/bin/gdalinfo.exe\" ", file)
          
        # pick size of each side
        x <- as.numeric(gsub("[^0-9]", "", unlist(strsplit(system(gdalinfo_str, intern = T)[3], ", "))))[1]
        y <- as.numeric(gsub("[^0-9]", "", unlist(strsplit(system(gdalinfo_str, intern = T)[3], ", "))))[2]
        
        # t is nr. of iterations per side
        t <- s - 1
        for (i in 0:t) {
            for (j in 0:t) {
                # [-srcwin xoff yoff xsize ysize] src_dataset dst_dataset
                srcwin_str <- paste("-srcwin ", i * x/s, j * y/s, x/s, y/s)
                gdal_str <- paste0("\"C:/OSGeo4W64/bin/gdal_translate.exe\" ", srcwin_str, " ", "\"", file, "\" ", "\"", filename, "_", i, "_", j, ".tif\"")
                system(gdal_str)
            }
        }
    }
    
    ## process all files and save to same directory
    mapply(split_raster, files, 2) 
    

    5 Feb 2014

    Use Case: Spatial R & Google Earth for Terrain Analyses

    I'd like to share code that uses spatial R and Google Earth for terrain analyses. In this example I took SRTM data at 1" resolution from http://www.viewfinderpanoramas.org/dem3.html#alps, read it into R, did a little processing and finally wrapped it up in a KML-file that I use as a ground-overlay in Google Earth. In fact, I eventually converted it into a SQLite DB with MAPC2MAPC for usage on a mobile device.

    See the code here on github..

    20 Jan 2014

    Get No. of Google Search Hits with R and XML

    UPDATE: Thanks to Max Ghenis for updating my R-script, which I wrote a while back - the script below can now be used again for pulling the number of hits from a Google search.

    GoogleHits <- function(input)
       {
        require(XML)
        require(RCurl)
        url <- paste("https://www.google.com/search?q=\"",
                     input, "\"", sep = "")
     
        CAINFO = paste(system.file(package="RCurl"), "/CurlSSL/ca-bundle.crt", sep = "")
        script <- getURL(url, followlocation = TRUE, cainfo = CAINFO)
        doc <- htmlParse(script)
        res <- xpathSApply(doc, '//*/div[@id="resultStats"]', xmlValue)
        cat(paste("\nYour Search URL:\n", url, "\n", sep = ""))
        cat("\nNo. of Hits:\n")
        return(as.integer(gsub("[^0-9]", "", res)))
       }
     
    # Example:
    GoogleHits("R%Statistical%Software")
    

    p.s.: If you try to do this in a robot fashion, like:
    lapply(list_of_search_terms, GoogleHits)
    

    Google will block you after about the 300th request!
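
    One way to be a little gentler (though there is no guarantee that Google won't block you anyway) is to pause between requests - a minimal sketch:

    # wait a random 5-10 seconds before each request
    lapply(list_of_search_terms, function(term) {
        Sys.sleep(runif(1, 5, 10))
        GoogleHits(term)
    })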