Andrés Alejos • Originally published at thestackcanary.com

Hacking Phoenix LiveUpload

The Phoenix Framework is a spectacular server-side-rendered (SSR) web development framework written in Elixir. Phoenix supports reactive functionality through its WebSocket-based library, LiveView, which makes writing dynamic SSR web apps a breeze. One of LiveView's built-in features is out-of-the-box support for file uploads. Some of its features, as described in the documentation, are:

  • Accept specification - Define accepted file types, max number of entries, max file size, etc. When the client selects file(s), the file metadata is automatically validated against the specification. See Phoenix.LiveView.allow_upload/3.
  • Reactive entries - Uploads are populated in an @uploads assign in the socket. Entries automatically respond to progress, errors, cancellation, etc.
  • Drag and drop - Use the phx-drop-target attribute to enable. See Phoenix.Component.live_file_input/1.

LiveView even includes an out-of-the-box solution for rendering previews of your uploads via the live_img_preview/1 component, but notably, these previews only work for image uploads by default, since the client hook simply sets the file data as a Blob URL on an img tag.

In this article, I will walk you through how the Phoenix live image preview works and show you how to customize it so you can render previews for other types of files. In particular, I will show how to render a PDF preview using Mozilla's pdf.js library.

Demo

The code for this project can be found on my GitHub. Feel free to give me a follow there to see what else I'm up to.

How Do Phoenix Uploads Work?

The Phoenix documentation is pretty thorough in describing how uploads are handled, but I will give a brief synopsis here for convenience.

All uploads held in a session are stored in the reserved assigns key @uploads. If you try to set or update the uploads assign manually, you will get an error, since Phoenix reserves that particular key in the assigns. You enable uploads for a particular LiveView using Phoenix.LiveView.allow_upload/3, which accepts a specification describing the allowed file types, max number of entries, max file size, and more. The name argument to allow_upload is a key identifying a specific upload within the @uploads assign, meaning you can have multiple upload "sessions" at once.
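
For example, a single LiveView could track two independent upload sessions under different keys. The key names and specs below are hypothetical, just to illustrate the idea:

# Hypothetical sketch: two separate upload configs in one LiveView,
# accessible later as @uploads.avatar and @uploads.documents
socket =
  socket
  |> allow_upload(:avatar, accept: ~w(.jpg .jpeg .png), max_entries: 1)
  |> allow_upload(:documents, accept: ~w(.pdf), max_entries: 3, max_file_size: 10_000_000)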

One of the keys you may optionally specify is :writer, which points to a module implementing the UploadWriter behaviour. The writer specifies what to do with the chunks of each uploaded file before the upload is consumed. If you do not specify this option, the default UploadTmpFileWriter is used.
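
For a sense of what implementing the behaviour looks like, here is a minimal sketch of a custom writer that buffers chunks in memory. The module name and strategy are hypothetical, not from this article's repo:

defmodule MyAppWeb.InMemoryWriter do
  # Hypothetical writer: buffers chunks in memory instead of a tmp file.
  # Wired up through the :writer option of allow_upload/3.
  @behaviour Phoenix.LiveView.UploadWriter

  @impl true
  def init(_opts), do: {:ok, %{chunks: []}}

  @impl true
  def write_chunk(data, state), do: {:ok, %{state | chunks: [data | state.chunks]}}

  @impl true
  def meta(state) do
    # Whatever this map contains is what consume_uploaded_entries/3 receives
    %{content: state.chunks |> Enum.reverse() |> IO.iodata_to_binary()}
  end

  @impl true
  def close(state, _reason), do: {:ok, state}
end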

You then use the Phoenix.Component.live_file_input/1 component to render the file input. As files are uploaded through the live file input, data about the entries (progress, errors, cancellation, and so on) is reflected in the @uploads assign.

The writer is the mediator between the moment a file is uploaded through the live_file_input and the moment it is consumed with Phoenix.LiveView.consume_uploaded_entries/3. The writer's meta/1 implementation dictates what your consume callback receives. For example, here is how the default writer works, along with the default example of how files are consumed:

defmodule Phoenix.LiveView.UploadTmpFileWriter do
  @moduledoc false

  @behaviour Phoenix.LiveView.UploadWriter

  # Other callbacks omitted for brevity

  @impl true
  def meta(state) do
    %{path: state.path}
  end
end

defmodule Test.DemoLive do
  def handle_event("save", _params, socket) do
    uploaded_files =
      consume_uploaded_entries(socket, :avatar, fn %{path: path}, _entry ->
        dest = Path.join("priv/static/uploads", Path.basename(path))
        File.cp!(path, dest)
        {:ok, Routes.static_path(socket, "/uploads/#{Path.basename(dest)}")}
      end)
    {:noreply, update(socket, :uploaded_files, &(&1 ++ uploaded_files))}
  end
end

Notice that the last parameter of consume_uploaded_entries/3 is a function whose first argument is whatever the writer's meta/1 returns, so if you find yourself implementing your own writer, keep this in mind. In our case, we are not going to change the writer, since the modifications we are making mostly live on the JavaScript side.
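
To make that concrete, if you used a writer whose meta/1 returned %{content: binary} (like the hypothetical in-memory sketch earlier), the consume callback would pattern match on that shape instead of %{path: path}:

# Sketch only: assumes a custom writer whose meta/1 returns %{content: binary}
uploaded_files =
  consume_uploaded_entries(socket, :avatar, fn %{content: content}, entry ->
    dest = Path.join("priv/static/uploads", entry.client_name)
    File.write!(dest, content)
    {:ok, Routes.static_path(socket, "/uploads/#{entry.client_name}")}
  end)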

Adding Custom Preview Rendering

Now I will discuss the necessary steps to add custom preview rendering to Phoenix Live Uploads. This will assume you have set up your Phoenix project with the appropriate routing and database connection, so we will not be discussing those portions. Refer to the getting started guide if needed.

The way-too-simplified explanation of what I'm going to show is how to synchronize the PDF rendering and its conversion to a data URL, which is then stored as an HTML attribute on the corresponding element. A unique entry reference ID identifies the correct element and rendered preview between server and client, and to generate that ID without disrupting other upload types, we reuse the UploadEntry class from the LiveUploader machinery that LiveView already uses.

To adapt this method to other file extensions, just change your rendering method and how you match on file extension.

The files in the repository that you should be concerned with are index.ex (the server-side LiveView code) and app.js (the client-side hooks); both are walked through in the sections below.

You will also have to make one modification to a file that is not included in the GitHub repository. After you run mix deps.get, open deps/phoenix_live_view/priv/static/phoenix_live_view.esm.js; at the bottom you will see where Phoenix LiveView declares its JavaScript exports. By default it only exports the LiveSocket class, but you will need to add UploadEntry as an export. This will be used within app.js.

//deps/phoenix_live_view/priv/static/phoenix_live_view.esm.js
// Bottom of the file

// Before modifications
export { 
  LiveSocket
};

// After modifications
export { 
  LiveSocket, 
  UploadEntry 
};

Modifying index.ex

This is where all of the server-side code for the uploads resides. Let's start by enabling uploads. We need to pick a key under which the uploads for this input batch will be stored; we will use :demo here:

defmodule UploadPdfPreviewWeb.DemoLive.Index do
  use UploadPdfPreviewWeb, :live_view

  @impl true
  def mount(_params, _session, socket) do
    socket =
      socket
      |> assign(:uploaded_files, [])
      |> allow_upload(
        :demo,
        accept: ~w(.pdf .jpg .jpeg .png .tif .tiff),
        max_entries: 5
      )

    {:ok, socket}
  end
  ...
end

So in this example we are accepting files with extensions .pdf, .jpg, .jpeg, .png, .tif, and .tiff. We are limiting an upload batch to at most five entries. We also add another assign separate from the @uploads assign where we will store uploaded files once they are consumed.

Most of the modifications we will make are in the client-side hooks invoked from the live_file_input and live_img_preview components. Since the hooks are hard-coded in those components, I looked at what the components do under the hood and copied their markup, with the hooks being the only change. For the live_file_input, we attach a custom hook called LiveFileUpload; for the live_img_preview, we conditionally attach a LivePdfPreview hook for PDF files and otherwise fall back to the default hook.

So now the custom live_file_input looks like:

<input
  id={@uploads.demo.ref}
  type="file"
  name={@uploads.demo.name}
  accept={@uploads.demo.accept}
  data-phx-hook="LiveFileUpload"
  data-phx-update="ignore"
  data-phx-upload-ref={@uploads.demo.ref}
  data-phx-active-refs={join_refs(for(entry <- @uploads.demo.entries, do: entry.ref))}
  data-phx-done-refs={
    join_refs(for(entry <- @uploads.demo.entries, entry.done?, do: entry.ref))
  }
  data-phx-preflighted-refs={
    join_refs(for(entry <- @uploads.demo.entries, entry.preflighted?, do: entry.ref))
  }
  data-phx-auto-upload={@uploads.demo.auto_upload?}
  multiple={@uploads.demo.max_entries > 1}
  class="sr-only"
/>

And the custom live_img_preview looks like:

<img
    id={"phx-preview-#{entry.ref}"}
    data-phx-upload-ref={entry.upload_ref}
    data-phx-entry-ref={entry.ref}
    data-phx-hook={
      if entry.client_type == "application/pdf",
        do: "LivePdfPreview",
        else: "Phoenix.LiveImgPreview"
    }
    data-phx-update="ignore"
/>

We can optionally validate the entries beyond the built-in validation:

  def handle_event("validate_upload", _params, socket) do
    num_remaining_uploads =
      length(socket.assigns.uploaded_files) - socket.assigns.uploads.demo.max_entries

    valid =
      Enum.uniq_by(socket.assigns.uploads.demo.entries, & &1.client_name)
      |> Enum.take(num_remaining_uploads)

    socket =
      Enum.reduce(socket.assigns.uploads.demo.entries, socket, fn entry, socket ->
        if entry in valid do
          socket
        else
          socket
          |> cancel_upload(:demo, entry.ref)
          |> put_flash(
            :error,
            "Uploaded files should be unique and cannot exceed #{socket.assigns.uploads.demo.max_entries} total files."
          )
        end
      end)

    {:noreply, socket}
  end

Here, we filter out duplicate files, check how many files have already been consumed in previous batches (stored in the :uploaded_files assign), and keep only as many new entries as we have slots remaining. Any entry that fails this validation has its upload cancelled.

Lastly, we add a handler for the file submissions, potentially customizing the behavior depending on file type:

 def handle_event("submit_upload", _params, socket) do
  uploaded_files =
    consume_uploaded_entries(socket, :demo, fn %{path: _path}, entry ->
      case entry.client_type do
        "application/pdf" ->
          # Handle PDFs
          IO.puts("PDF")

        _ ->
          # Handle images
          IO.puts("Image")
      end
    end)

  socket =
    socket
    |> update(:uploaded_files, &(&1 ++ uploaded_files))

  {:noreply, socket}
end

Modifying app.js

Next we can modify the client-side hooks, which consist of implementing the two hooks we referenced in our server code. The LiveFileUpload hook essentially commandeers the stock implementation from Phoenix LiveView (which you can view here). It relies on the UploadEntry class we exported earlier, which communicates with the LiveUploader singleton that tracks the current uploads. The custom implementation reimplements the stock version, adding a check and new behavior for PDF files. Let's walk through this code:
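
For context, the top of app.js wires the hooks and the UploadEntry import up roughly like this. This is standard Phoenix generator boilerplate aside from the extra import, which the export change above enables:

// app.js (sketch): standard LiveSocket setup, plus the UploadEntry class
// we exported from phoenix_live_view.esm.js earlier
import { Socket } from "phoenix";
import { LiveSocket, UploadEntry } from "phoenix_live_view";

let Hooks = {};
// Hooks.LiveFileUpload and Hooks.LivePdfPreview are defined below

let csrfToken = document
  .querySelector("meta[name='csrf-token']")
  .getAttribute("content");

let liveSocket = new LiveSocket("/live", Socket, {
  params: { _csrf_token: csrfToken },
  hooks: Hooks,
});

liveSocket.connect();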

LiveFileUpload Hook

Hooks.LiveFileUpload = {
  activeRefs() {
    return this.el.getAttribute("data-phx-active-refs");
  },

  preflightedRefs() {
    return this.el.getAttribute("data-phx-preflighted-refs");
  },

  mounted() {
    this.preflightedWas = this.preflightedRefs();
    let pdfjsLib = window["pdfjs-dist/build/pdf"];
    // Ensure pdfjsLib is available globally
    if (typeof pdfjsLib === "undefined") {
      console.error("pdf.js is not loaded");
      return;
    }
    // Use the global `pdfjsLib` to access PDFJS functionalities
    pdfjsLib.GlobalWorkerOptions.workerSrc =
      "https://cdnjs.cloudflare.com/ajax/libs/pdf.js/3.11.174/pdf.worker.min.js";
    this.el.addEventListener("input", (event) => {
      const files = event.target.files;
      for (const file of files) {
        if (file.type === "application/pdf") {
          const fileReader = new FileReader();
          fileReader.onload = (e) => {
            const typedarray = new Uint8Array(e.target.result);
            // Load the PDF file
            pdfjsLib.getDocument(typedarray).promise.then((pdf) => {
              // Assuming you want to preview the first page of each PDF
              pdf.getPage(1).then((page) => {
                const scale = 1.5;
                const viewport = page.getViewport({ scale: scale });
                const canvas = document.createElement("canvas");
                const context = canvas.getContext("2d");
                canvas.height = viewport.height;
                canvas.width = viewport.width;

                // Render PDF page into canvas context
                const renderContext = {
                  canvasContext: context,
                  viewport: viewport,
                };
                page.render(renderContext).promise.then(() => {
                  // Convert canvas to image and set as source for the element
                  const imgSrc = canvas.toDataURL("image/png");
                  // Constructing an UploadEntry generates a unique ref for this
                  // file via LiveView's client-side upload machinery
                  let upload_entry = new UploadEntry(
                    this.el,
                    file,
                    this.__view
                  );
                  const imgEl = document.getElementById(
                    `phx-preview-${upload_entry.ref}`
                  );
                  if (imgEl) {
                    // Preview element is already in the DOM: set its src now
                    imgEl.setAttribute("src", imgSrc);
                  } else {
                    // Otherwise stash the data URL on the input element so the
                    // LivePdfPreview hook can pick it up when it mounts
                    this.el.setAttribute(
                      `pdf-preview-${upload_entry.ref}`,
                      imgSrc
                    );
                  }
                });
              });
            });
          };
          fileReader.readAsArrayBuffer(file);
        }
      }
    });
  },
  updated() {
    let newPreflights = this.preflightedRefs();
    if (this.preflightedWas !== newPreflights) {
      this.preflightedWas = newPreflights;
      if (newPreflights === "") {
        this.__view.cancelSubmit(this.el.form);
      }
    }

    if (this.activeRefs() === "") {
      this.el.value = null;
    }
    this.el.dispatchEvent(new CustomEvent("phx:live-file:updated"));
  },
};

On mount, we grab the globally available pdf.js library and point its worker at the Cloudflare CDN (pin whichever version you are actually using), and we add a listener for the input event on the file input. This event fires whenever files are selected, and event.target.files contains every file in the upload batch (rather than one at a time).
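
Note that mounted() only configures the worker; it assumes the pdf.js library itself has already been loaded globally, for example with a script tag in the root layout. The tag below is my assumption of how that might look, not code from the repo:

<!-- root.html.heex: load pdf.js so window["pdfjs-dist/build/pdf"] is defined -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/pdf.js/3.11.174/pdf.min.js">
</script>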

We then iterate through the selected files; we only care about PDFs, and all other files fall through to the default behavior. For each PDF, we use pdf.js to render the first page (you could extend this to more pages if you like) for the sake of the preview. We render the page into a canvas, which we then convert to a PNG encoded as a base64 data URL via canvas.toDataURL. We then create a new UploadEntry for the file, which assigns it a unique ID (ref) through LiveView's client-side upload machinery. Since the upload now has a unique ID, we can attach the rendered preview to the DOM under that ID, to be picked up later by the LivePdfPreview hook.

LivePdfPreview Hook

We now define the LivePdfPreview hook, which is only used to preview PDF files. The code here is derivative of the default Phoenix.LiveImgPreview hook and only works in tandem with the previous hook we discussed.

Here, we check whether the PDF preview has already been rendered and stashed on the input element. If it hasn't, we set a placeholder image as the source, knowing that the first hook will swap it out once rendering finishes. Otherwise, we set the element's src to the data URL produced by the previous hook.

Hooks.LivePdfPreview = {
  mounted() {
    this.ref = this.el.getAttribute("data-phx-entry-ref");
    this.inputEl = document.getElementById(
      this.el.getAttribute("data-phx-upload-ref")
    );
    let src = this.inputEl.getAttribute(`pdf-preview-${this.ref}`);
    if (!src) {
      src = "https://poainc.org/wp-content/uploads/2018/06/pdf-placeholder.png";
    } else {
      this.inputEl.removeAttribute(`pdf-preview-${this.ref}`);
    }
    this.el.src = src;
    this.url = src;
  },
  destroyed() {
    // Carried over from the default image-preview hook; revokeObjectURL is a
    // harmless no-op for the data URL / placeholder we use here
    if (this.url) {
      URL.revokeObjectURL(this.url);
    }
  },
};

We have to add this synchronization because there is no guarantee that the PDF has finished rendering by the time the preview element appears in the DOM. The preview is conditionally rendered in the LiveView once entries are populated into the assigns:

<li :for={entry <- @uploads.demo.entries} class="relative">
  <div class="group aspect-h-7 aspect-w-10 block w-full overflow-hidden rounded-lg bg-gray-100 focus-within:ring-2 focus-within:ring-indigo-500 focus-within:ring-offset-2 focus-within:ring-offset-gray-100">
    <img
      id={"phx-preview-#{entry.ref}"}
      data-phx-upload-ref={entry.upload_ref}
      data-phx-entry-ref={entry.ref}
      data-phx-hook={
        if entry.client_type == "application/pdf",
          do: "LivePdfPreview",
          else: "Phoenix.LiveImgPreview"
      }
      data-phx-update="ignore"
    />
...
</li>

Final Thoughts

It is worth noting that the provided GitHub repo contains some additional functionality to store the previews as base64-encoded data URLs for later use. I didn't go over it in this article since it is not core to the task at hand, but I needed it for my use case, so I went ahead and included it in the code. The relevant pieces are the GatherPreviews hook in app.js and the handle_event("update_preview_srcs") handler in index.ex.

I want to emphasize that this very well might not be the best way to achieve this goal, but after reading the docs and source code of Phoenix Live View for quite some time it didn't seem that there was a clean API or support for customizing this behavior. I'd love to hear about any other methods in the comments below!
