When we last left off we had just finished putting together our download endpoint. This time around we're going to extend it to actually attempt to download a file. That means we'll also be touching on how we're going to test our download endpoint. Are you ready? Let's dive in!
Pass Four
It may look scary as we've added a bunch more imports, with io, net/url, os, and path/filepath appearing for the first time. Keep an eye out for them later on; you'll see they aren't scary at all. Now, let's jump most of the way down our code to the bottom of the handleDownloadRequest function.
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"io/ioutil"
	"log"
	"net/http"
	"net/url"
	"os"
	"path/filepath"
)
type download struct {
	Title    string `json:"title"`
	Location string `json:"location"`
}
func status(response http.ResponseWriter, request *http.Request) {
	fmt.Fprintf(response, "Hello!")
}
func handleDownloadRequest(response http.ResponseWriter, request *http.Request) {
	var downloadRequest download
	r, err := ioutil.ReadAll(request.Body)
	if err != nil {
		http.Error(response, "bad request", 400)
		log.Println(err)
		return
	}
	defer request.Body.Close()

	err = json.Unmarshal(r, &downloadRequest)
	if err != nil {
		http.Error(response, "bad request: "+err.Error(), 400)
		return
	}
	log.Printf("%#v", downloadRequest)
Everything should look pretty familiar from our last revision. Besides the new imports, nothing else has changed so far. At this point in the code, we have our struct populated with the title and URL for downloading. Let's pass that to a new function, getFile. getFile will return nil if successful, or an error otherwise. If something goes wrong, we'll simply return a 500 "internal server error" back to the browser while logging something a bit more specific internally. We want to avoid leaking server information back to the browser whenever possible for security purposes. I'm not really too concerned about it for this project, but it's a good thing to keep in mind.
	err = getFile(downloadRequest)
	if err != nil {
		http.Error(response, "internal server error", 500)
		return
	}

	fmt.Fprintf(response, "Download!")
}
At the moment getFile isn't a very large function. As noted above, we're passing in our download struct and returning an error (or nil). Since we're going to be saving the file to disk, I'm using url.Parse from the net/url package to parse the URL into its component parts.
func getFile(downloadRequest download) error {
	parsedUrl, err := url.Parse(downloadRequest.Location)
	if err != nil {
		log.Println(err)
		return err
	}
Finally, the heart of our project! http.Get takes our URL and requests the remote resource. However, as written this could be a problem: we're assuming that whatever URL we pulled from the JSON is OK. At the very least it's "well-formed", or it would not have made it past the url.Parse call. Later on, we'll likely need to inspect it much more closely to decide whether we should even attempt the GET request (I've sketched one possibility just after the next code block).
	response, err := http.Get(downloadRequest.Location)
	if err != nil {
		log.Println(err)
		return err
	}
	defer response.Body.Close()
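As a hedged aside before we move on: the helper below is not part of today's code, and the name validateURL is my own invention, but it sketches the kind of closer inspection we might add later, rejecting anything that isn't a plain http or https URL before we ever call http.Get.

func validateURL(parsedUrl *url.URL) error {
	// Reject schemes like file:// or ftp:// before attempting a GET.
	if parsedUrl.Scheme != "http" && parsedUrl.Scheme != "https" {
		return fmt.Errorf("unsupported URL scheme: %q", parsedUrl.Scheme)
	}
	// A URL with no host can't point at a remote resource.
	if parsedUrl.Host == "" {
		return fmt.Errorf("missing host in URL")
	}
	return nil
}

We would call it with the parsedUrl we already have from url.Parse and return early on error, before the GET ever fires.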
Let's assume everything went as it should and we have retrieved the remote resource. We're going to use parsedUrl.Path and filepath.Base to pull out what should be the proper filename. Again, we might need to extend this later: what if we get a URL that doesn't have a proper filename in it? (I'll sketch one possible guard right after this function.) Since we control the testing environment, we should be OK for now.
	out, err := os.Create(filepath.Base(parsedUrl.Path))
	if err != nil {
		log.Println(err)
		return err
	}
	// Only defer the close once we know we actually have a file.
	defer out.Close()

	_, err = io.Copy(out, response.Body)
	if err != nil {
		log.Println(err)
		return err
	}

	return nil
}
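One possible guard (again, only a sketch for later, and baseFilename is a hypothetical name) would be to inspect what filepath.Base hands back before creating the file:

func baseFilename(parsedUrl *url.URL) (string, error) {
	// filepath.Base returns "." for an empty path and "/" for a bare
	// root, neither of which is a filename we want to create.
	name := filepath.Base(parsedUrl.Path)
	if name == "." || name == "/" || name == "" {
		return "", fmt.Errorf("no usable filename in URL path %q", parsedUrl.Path)
	}
	return name, nil
}

Our test URL ends in a real filename, though, so the simple version above does the job for now.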
func main() {
	log.Println("Downloader")
	http.HandleFunc("/", status)
	http.HandleFunc("/download", handleDownloadRequest)
	log.Fatal(http.ListenAndServe(":3000", nil))
}
Testing
To make our testing easy we're going to use the fantastic Postman. If you're going to be doing any work with APIs, I recommend adding it to your toolkit; it's indispensable. I'm not going to go too deep into using Postman itself though; that is left as an exercise for the reader.
We'll use the following JSON object for this round of testing.
{
	"title": "Attempting Go",
	"location": "https://shindakun.glitch.me/content/images/2018/01/attemptinggo-2.png"
}
Upon running the current version of the code we should see something like:
$ go run downloader.go
2018/01/11 11:06:24 Downloader
Now, if we submit our PUT request with Postman we should see:
2018/01/11 11:07:32 main.download{Title:"Attempting Go", Location:"https://shindakun.glitch.me/content/images/2018/01/attemptinggo-2.png"}
Along with a newly created PNG in the same directory.
-rw-r--r-- 1 steve 197609 74860 Jan 11 11:07 attemptinggo-2.png
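As an aside, if you'd rather have an automated check than clicking through Postman every time, something like the following rough sketch, using the standard library's net/http/httptest package, could live in a downloader_test.go next to our code (the test name and file name are my own). Note that it calls the real handler, so it actually downloads the file over the network:

package main

import (
	"net/http"
	"net/http/httptest"
	"strings"
	"testing"
)

func TestHandleDownloadRequest(t *testing.T) {
	body := strings.NewReader(`{"title": "Attempting Go", "location": "https://shindakun.glitch.me/content/images/2018/01/attemptinggo-2.png"}`)

	request := httptest.NewRequest("PUT", "/download", body)
	recorder := httptest.NewRecorder()

	// Invoke the handler directly; on success this writes the PNG to disk.
	handleDownloadRequest(recorder, request)

	if recorder.Code != http.StatusOK {
		t.Errorf("expected status 200, got %d", recorder.Code)
	}
}

Run it with go test and you should end up with the same PNG on disk.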
Awesome! Tune in tomorrow when we clean up a bit and add the ability to save files into a subdirectory. After that, I think we'll look into doing our initial deployment!
Until next time...
You can find the code for this and most of the other Attempting to Learn Go posts in the repo on GitHub.