Learning Elixir: Pipe Operator

The pipe operator (|>) transforms the way we compose functions in Elixir, turning nested function calls into elegant, readable pipelines that flow naturally from left to right. This simple yet powerful operator embodies the essence of functional programming by making data transformations explicit and sequential, allowing you to express complex operations as a series of simple, composable steps. Rather than wrapping functions inside functions, creating hard-to-read nested structures, the pipe operator lets you chain operations together in a way that mirrors how we naturally think about data processing. In this article, we'll explore how this foundational Elixir feature can dramatically improve your code's readability and maintainability.
Note: The examples in this article use Elixir 1.18.3. While most operations should work across different versions, some functionality might vary.
Table of Contents
- Introduction
- Understanding the Pipe Operator
- Basic Usage
- Transformation Pipelines
- Combining with Pattern Matching
- Common Pitfalls and Solutions
- Advanced Techniques
- Real-World Examples
- Best Practices
- Conclusion
- Further Reading
- Next Steps
Introduction
In previous articles, we've explored pattern matching in functions and how it enables elegant, declarative code. Now, we'll discover how the pipe operator complements pattern matching and other Elixir features to create expressive data transformation pipelines.
The pipe operator addresses a fundamental challenge in functional programming: how to compose multiple function calls without creating deeply nested, hard-to-read code. Instead of reading functions from the inside out, the pipe operator allows us to read them from left to right, following the natural flow of data transformation.
Consider this simple example of data processing without the pipe operator:
String.split(String.downcase(String.trim(" Hello World ")), " ")
With the pipe operator, the same operation becomes:
" Hello World " |> String.trim() |> String.downcase() |> String.split(" ")
The transformation is immediately clearer: we start with a string, trim it, convert it to lowercase, and split it by spaces. This linear flow makes the code self-documenting and easier to understand.
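Running either version in IEx produces the same result, so the difference is purely one of readability:
iex> " Hello World " |> String.trim() |> String.downcase() |> String.split(" ")
["hello", "world"]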
Understanding the Pipe Operator
How It Works
The pipe operator takes the result of the expression on its left and passes it as the first argument to the function on its right. This simple rule enables powerful function composition.
# Without pipe operator
result = function_c(function_b(function_a(value)))
# With pipe operator
result = value |> function_a() |> function_b() |> function_c()
The First Argument Rule
The key to understanding the pipe operator is remembering that it always passes the value as the first argument to the next function:
# This:
"hello" |> String.upcase()
# Is equivalent to:
String.upcase("hello")
# And this:
"hello world" |> String.split(" ")
# Is equivalent to:
String.split("hello world", " ")
Testing in IEx:
iex> "hello" |> String.upcase()
"HELLO"
iex> "hello world" |> String.split(" ")
["hello", "world"]
iex> 10 |> Integer.to_string()
"10"
iex> [1, 2, 3] |> Enum.map(&(&1 * 2))
[2, 4, 6]
Basic Usage
Simple Transformations
Let's start with basic examples to understand how the pipe operator works:
defmodule BasicPipes do
def process_name(name) do
name
|> String.trim()
|> String.downcase()
|> String.capitalize()
end
def calculate_total(items) do
items
|> Enum.map(&(&1.quantity * &1.price))
|> Enum.sum()
end
end
Testing in IEx:
iex> BasicPipes.process_name(" john DOE ")
"John doe"
iex> items = [%{quantity: 2, price: 10.5}, %{quantity: 1, price: 5.0}]
[%{price: 10.5, quantity: 2}, %{price: 5.0, quantity: 1}]
iex> BasicPipes.calculate_total(items)
26.0
Multiple Arguments
When a function requires multiple arguments, the piped value becomes the first argument:
defmodule MultiArgExample do
def format_currency(amount) do
amount
|> to_float()
|> Float.round(2)
|> Float.to_string()
|> String.split(".")
|> format_parts()
end
defp to_float(value) when is_float(value), do: value
defp to_float(value) when is_integer(value), do: value / 1
defp format_parts([whole, decimal]) do
"$#{whole}.#{String.pad_trailing(decimal, 2, "0")}"
end
defp format_parts([whole]) do
"$#{whole}.00"
end
end
Testing in IEx:
iex> MultiArgExample.format_currency(10.5)
"$10.50"
iex> MultiArgExample.format_currency(10)
"$10.00"
iex> MultiArgExample.format_currency(10.567)
"$10.57"
Working with Different Data Types
The pipe operator works with any data type:
defmodule DataTypeExamples do
def process_list(list) do
list
|> Enum.filter(&(&1 > 0))
|> Enum.map(&(&1 * 2))
|> Enum.sort(:desc)
end
def process_map(map) do
map
|> Map.put(:processed, true)
|> Map.update!(:count, &(&1 + 1))
|> Map.delete(:temp)
end
def process_tuple(tuple) do
tuple
|> Tuple.to_list()
|> Enum.reverse()
|> List.to_tuple()
end
end
Testing in IEx:
iex> DataTypeExamples.process_list([3, -1, 4, 1, 5, -2])
[10, 8, 6, 2]
iex> DataTypeExamples.process_map(%{count: 5, temp: "remove me", name: "test"})
%{count: 6, name: "test", processed: true}
iex> DataTypeExamples.process_tuple({1, 2, 3, 4})
{4, 3, 2, 1}
Transformation Pipelines
Building Complex Pipelines
The real power of the pipe operator becomes apparent when building complex data transformation pipelines:
defmodule UserProcessor do
def process_user_data(user_map) do
user_map
|> validate_fields()
|> normalize_data()
|> enrich_with_defaults()
|> calculate_derived_fields()
|> format_output()
end
defp validate_fields(data) when is_map(data) do
data
|> Map.take([:name, :email, :age])
|> Enum.filter(fn {_k, v} -> v != nil end)
|> Map.new()
end
defp validate_fields(_), do: %{}
defp normalize_data(data) do
data
|> Map.update(:name, "", &String.trim/1)
|> Map.update(:email, "", &String.downcase/1)
|> Map.update(:age, 0, &ensure_integer/1)
end
defp ensure_integer(value) when is_integer(value), do: value
defp ensure_integer(value) when is_binary(value) do
case Integer.parse(value) do
{int, _} -> int
:error -> 0
end
end
defp ensure_integer(_), do: 0
defp enrich_with_defaults(data) do
defaults = %{
status: "active",
created_at: DateTime.utc_now() |> DateTime.to_string()
}
Map.merge(defaults, data)
end
defp calculate_derived_fields(data) do
data
|> Map.put(:is_adult, data[:age] >= 18)
|> Map.put(:username, generate_username(data[:email]))
end
defp generate_username(""), do: ""
defp generate_username(email) do
email
|> String.split("@")
|> List.first()
|> String.replace(".", "_")
end
defp format_output(data) do
data
|> Map.to_list()
|> Enum.sort()
|> Enum.map(fn {k, v} -> "#{k}: #{inspect(v)}" end)
|> Enum.join("\n")
end
end
Testing in IEx:
iex> user_data = %{name: " John Doe ", email: "JOHN.DOE@EXAMPLE.COM", age: "25"}
%{age: "25", email: "JOHN.DOE@EXAMPLE.COM", name: " John Doe "}
iex> UserProcessor.process_user_data(user_data)
"age: 25\ncreated_at: \"2025-05-31 14:30:00.123456Z\"\nemail: \"john.doe@example.com\"\nis_adult: true\nname: \"John Doe\"\nstatus: \"active\"\nusername: \"john_doe\""
iex> incomplete_data = %{name: "Alice", age: "17"}
%{age: "17", name: "Alice"}
iex> UserProcessor.process_user_data(incomplete_data)
"age: 17\ncreated_at: \"2025-05-31 14:30:01.456789Z\"\nemail: \"\"\nis_adult: false\nname: \"Alice\"\nstatus: \"active\"\nusername: \"\""
iex> empty_data = %{}
%{}
iex> UserProcessor.process_user_data(empty_data)
"age: 0\ncreated_at: \"2025-05-31 14:30:02.789012Z\"\nemail: \"\"\nis_adult: false\nstatus: \"active\"\nusername: \"\""
iex> invalid_data = "not a map"
"not a map"
iex> UserProcessor.process_user_data(invalid_data)
"age: 0\ncreated_at: \"2025-05-31 14:30:03.345678Z\"\nemail: \"\"\nis_adult: false\nstatus: \"active\"\nusername: \"\""
Conditional Pipelines
Sometimes you need conditional logic within pipelines:
defmodule ConditionalPipeline do
def process_order(order) do
order
|> validate_order()
|> maybe_apply_discount()
|> calculate_tax()
|> add_shipping()
|> finalize_order()
end
defp validate_order(order) do
if order.items == [] do
{:error, "Order has no items"}
else
{:ok, order}
end
end
defp maybe_apply_discount({:error, _} = error), do: error
defp maybe_apply_discount({:ok, order}) do
if order.total > 100 do
{:ok, %{order | total: order.total * 0.9} |> Map.put(:discount_applied, true)}
else
{:ok, Map.put(order, :discount_applied, false)}
end
end
defp calculate_tax({:error, _} = error), do: error
defp calculate_tax({:ok, order}) do
tax = order.total * 0.08
{:ok, %{order | total: order.total + tax} |> Map.put(:tax, tax)}
end
defp add_shipping({:error, _} = error), do: error
defp add_shipping({:ok, order}) do
shipping = if order.total > 50, do: 0, else: 10
{:ok, %{order | total: order.total + shipping} |> Map.put(:shipping, shipping)}
end
defp finalize_order({:error, _} = error), do: error
defp finalize_order({:ok, order}) do
{:ok, Map.put(order, :status, "completed")}
end
end
Testing in IEx:
iex> order = %{items: ["item1", "item2"], total: 120}
%{items: ["item1", "item2"], total: 120}
iex> ConditionalPipeline.process_order(order)
{:ok,
%{
discount_applied: true,
items: ["item1", "item2"],
shipping: 0,
status: "completed",
tax: 8.64,
total: 116.64
}}
iex> small_order = %{items: ["item1"], total: 30}
%{items: ["item1"], total: 30}
iex> ConditionalPipeline.process_order(small_order)
{:ok,
%{
discount_applied: false,
items: ["item1"],
shipping: 10,
status: "completed",
tax: 2.4,
total: 42.4
}}
iex> empty_order = %{items: [], total: 0}
%{items: [], total: 0}
iex> ConditionalPipeline.process_order(empty_order)
{:error, "Order has no items"}
Combining with Pattern Matching
The pipe operator works seamlessly with pattern matching, creating powerful and expressive code:
defmodule PipeAndPattern do
def analyze_response(response) do
response
|> parse_response()
|> extract_data()
|> process_by_type()
|> format_result()
end
defp parse_response(%{"type" => _, "data" => _} = response) do
{:ok, response}
end
defp parse_response(%{}) do
{:error, :missing_fields}
end
defp parse_response(_) do
{:error, :invalid_format}
end
defp extract_data({:error, _} = error), do: error
defp extract_data({:ok, %{"type" => type, "data" => data}}) do
{:ok, {type, data}}
end
defp process_by_type({:error, _} = error), do: error
defp process_by_type({:ok, {"user", data}}) do
{:ok, {:user, process_user(data)}}
end
defp process_by_type({:ok, {"product", data}}) do
{:ok, {:product, process_product(data)}}
end
defp process_by_type({:ok, {type, _}}) do
{:error, {:unknown_type, type}}
end
defp process_user(data) do
data
|> Map.take(["name", "email"])
|> Map.put("processed_at", DateTime.utc_now())
end
defp process_product(data) do
data
|> Map.take(["name", "price"])
|> Map.update("price", 0.0, &parse_price/1)
end
defp parse_price(price) when is_number(price), do: price
defp parse_price(price) when is_binary(price) do
case Float.parse(price) do
{float, _} -> float
:error -> 0.0
end
end
defp parse_price(_), do: 0.0
defp format_result({:error, :missing_fields}), do: "Error: missing_fields"
defp format_result({:error, :invalid_format}), do: "Error: invalid_format"
defp format_result({:error, {:unknown_type, type}}), do: "Error: unknown_type #{type}"
defp format_result({:ok, {:user, data}}) do
"Processed user: #{data["name"]} (#{data["email"]}) at #{data["processed_at"]}"
end
defp format_result({:ok, {:product, data}}) do
"Processed product: #{data["name"]} - $#{data["price"]}"
end
end
Testing in IEx:
iex> user_response = %{"type" => "user", "data" => %{"name" => "Alice", "email" => "alice@example.com", "age" => 30}}
%{
"data" => %{"age" => 30, "email" => "alice@example.com", "name" => "Alice"},
"type" => "user"
}
iex> PipeAndPattern.analyze_response(user_response)
"Processed user: Alice (alice@example.com) at 2025-05-31 14:25:11.668875Z"
iex> product_response = %{"type" => "product", "data" => %{"name" => "Widget", "price" => "19.99", "stock" => 100}}
%{
"data" => %{"name" => "Widget", "price" => "19.99", "stock" => 100},
"type" => "product"
}
iex> PipeAndPattern.analyze_response(product_response)
"Processed product: Widget - $19.99"
iex> unknown_type = %{"type" => "order", "data" => %{"id" => 123}}
%{"data" => %{"id" => 123}, "type" => "order"}
iex> PipeAndPattern.analyze_response(unknown_type)
"Error: unknown_type order"
iex> invalid_response = %{"name" => "Missing type and data fields"}
%{"name" => "Missing type and data fields"}
iex> PipeAndPattern.analyze_response(invalid_response)
"Error: :missing_fields"
iex> PipeAndPattern.analyze_response("not a map")
"Error: :invalid_format"
Using case and with in Pipelines
Sometimes you need more complex control flow within pipelines:
defmodule AdvancedPipeline do
def process_file(filename) do
filename
|> File.read()
|> case do
{:ok, content} ->
content
|> String.split("\n")
|> Enum.map(&process_line/1)
|> Enum.filter(&(&1 != :skip))
{:error, reason} ->
{:error, "Failed to read file: #{reason}"}
end
end
defp process_line(line) do
line
|> String.trim()
|> case do
"" -> :skip
"#" <> _comment -> :skip
data -> String.split(data, ",")
end
end
def complex_validation(data) do
data
|> Map.new()
|> then(fn map ->
with {:ok, validated_name} <- validate_name(map[:name]),
{:ok, validated_age} <- validate_age(map[:age]),
{:ok, validated_email} <- validate_email(map[:email]) do
%{
name: validated_name,
age: validated_age,
email: validated_email
}
else
{:error, _} = error -> error
end
end)
end
defp validate_name(nil), do: {:error, "Name is required"}
defp validate_name(name) when is_binary(name) and byte_size(name) > 0, do: {:ok, name}
defp validate_name(_), do: {:error, "Invalid name"}
defp validate_age(nil), do: {:error, "Age is required"}
defp validate_age(age) when is_integer(age) and age >= 0, do: {:ok, age}
defp validate_age(_), do: {:error, "Invalid age"}
defp validate_email(nil), do: {:error, "Email is required"}
defp validate_email(email) when is_binary(email) do
if String.contains?(email, "@"), do: {:ok, email}, else: {:error, "Invalid email"}
end
defp validate_email(_), do: {:error, "Invalid email"}
end
Testing in IEx:
iex> content = "name,age\nAlice,25\n# comment\nBob,30\n"
"name,age\nAlice,25\n# comment\nBob,30\n"
iex> File.write!("/tmp/test.csv", content)
:ok
iex> AdvancedPipeline.process_file("/tmp/test.csv")
[["name", "age"], ["Alice", "25"], ["Bob", "30"]]
iex> AdvancedPipeline.process_file("/tmp/nonexistent.csv")
{:error, "Failed to read file: enoent"}
iex> valid_data = [name: "Alice", age: 25, email: "alice@example.com"]
[name: "Alice", age: 25, email: "alice@example.com"]
iex> AdvancedPipeline.complex_validation(valid_data)
%{age: 25, email: "alice@example.com", name: "Alice"}
iex> invalid_data = [name: "Charlie", age: -5, email: "invalid-email"]
[name: "Charlie", age: -5, email: "invalid-email"]
iex> AdvancedPipeline.complex_validation(invalid_data)
{:error, "Invalid age"}
iex> missing_name = [age: 30, email: "bob@example.com"]
[age: 30, email: "bob@example.com"]
iex> AdvancedPipeline.complex_validation(missing_name)
{:error, "Name is required"}
iex> File.rm("/tmp/test.csv")
:ok
Common Pitfalls and Solutions
Pitfall 1: Wrong Argument Position
The pipe operator always passes the value as the first argument. This can cause issues with functions that expect the piped value in a different position:
defmodule ArgumentPosition do
@admin_roles ["admin", "superadmin"]
# Problem: we want Enum.member?(@admin_roles, role), but the pipe would
# place the piped role into the first (enumerable) position
def has_admin_role_wrong(user) do
user
|> Map.get(:role)
# This won't work as expected!
# |> Enum.member?(@admin_roles)
end
# Solution 1: Use an anonymous function
def has_admin_role_v1(user) do
user
|> Map.get(:role)
|> then(fn role -> Enum.member?(@admin_roles, role) end)
end
# Solution 2: Use the capture operator with proper positioning
def has_admin_role_v2(user) do
user
|> Map.get(:role)
|> then(&Enum.member?(@admin_roles, &1))
end
# Solution 3: Create a helper function with arguments in the right order
def has_admin_role_v3(user) do
user
|> Map.get(:role)
|> member_of?(@admin_roles)
end
defp member_of?(item, list), do: Enum.member?(list, item)
end
Testing in IEx:
iex> user = %{name: "Alice", role: "admin"}
%{name: "Alice", role: "admin"}
iex> ArgumentPosition.has_admin_role_v1(user)
true
iex> ArgumentPosition.has_admin_role_v2(user)
true
iex> ArgumentPosition.has_admin_role_v3(user)
true
Pitfall 2: Side Effects in Pipelines
Pipelines should be used for transformations, not side effects:
defmodule SideEffectPitfall do
# Bad: Mixing side effects with transformations
def process_data_bad(data) do
data
|> validate()
|> IO.inspect(label: "After validation") # Side effect!
|> transform()
|> save_to_db() # Side effect!
|> format_response()
end
# Better: Separate concerns
def process_data_good(data) do
with {:ok, validated} <- validate(data),
{:ok, transformed} <- transform(validated),
{:ok, saved} <- save_to_db(transformed) do
format_response(saved)
else
{:error, reason} -> {:error, reason}
end
end
# Best: Use tap for debugging without breaking the pipeline
def process_data_with_debugging(data) do
data
|> validate()
|> tap(&IO.inspect(&1, label: "After validation"))
|> transform()
|> tap(&log_transformation(&1))
|> format_response()
end
defp validate(data), do: {:ok, data}
defp transform({:ok, data}), do: {:ok, Map.put(data, :transformed, true)}
defp transform(data), do: {:ok, Map.put(data, :transformed, true)}
defp save_to_db({:ok, data}), do: {:ok, Map.put(data, :id, :rand.uniform(1000))}
defp save_to_db(data), do: {:ok, Map.put(data, :id, :rand.uniform(1000))}
defp format_response({:ok, data}), do: {:success, data}
defp format_response(data), do: {:success, data}
defp log_transformation({:ok, data}) do
IO.puts("Transformation complete: #{inspect(data)}")
end
defp log_transformation(data) do
IO.puts("Transformation complete: #{inspect(data)}")
end
end
Testing in IEx:
iex> data = %{name: "Alice", age: 25}
%{name: "Alice", age: 25}
iex> SideEffectPitfall.process_data_bad(data)
After validation: {:ok, %{age: 25, name: "Alice"}}
{:success, %{age: 25, id: 432, name: "Alice", transformed: true}}
iex> SideEffectPitfall.process_data_good(data)
{:success, %{age: 25, id: 876, name: "Alice", transformed: true}}
iex> SideEffectPitfall.process_data_with_debugging(data)
After validation: {:ok, %{age: 25, name: "Alice"}}
Transformation complete: {:ok, %{age: 25, name: "Alice", transformed: true}}
{:success, %{age: 25, name: "Alice", transformed: true}}
iex> empty_data = %{}
%{}
iex> SideEffectPitfall.process_data_with_debugging(empty_data)
After validation: {:ok, %{}}
Transformation complete: {:ok, %{transformed: true}}
{:success, %{transformed: true}}
Pitfall 3: Overusing the Pipe Operator
Not every sequence of function calls benefits from the pipe operator:
defmodule PipeOveruse do
# Overused: Single transformation doesn't need pipe
def get_name_bad(user) do
user
|> Map.get(:name)
end
# Better: Direct function call
def get_name_good(user) do
Map.get(user, :name)
end
# Overused: Complex branching logic
def complex_logic_bad(data) do
data
|> validate()
|> case do
{:ok, valid} ->
valid
|> process()
|> case do
{:ok, processed} ->
processed
|> finalize()
error -> error
end
error -> error
end
end
# Better: Use with for complex error handling
def complex_logic_good(data) do
with {:ok, valid} <- validate(data),
{:ok, processed} <- process(valid),
{:ok, finalized} <- finalize(processed) do
{:ok, finalized}
end
end
defp validate(%{valid: true} = data), do: {:ok, data}
defp validate(%{valid: false}), do: {:error, "Invalid data"}
defp validate(data), do: {:ok, data}
defp process(data), do: {:ok, Map.put(data, :processed, true)}
defp finalize(data), do: {:ok, Map.put(data, :finalized, true)}
end
Testing in IEx:
iex> user = %{name: "Alice", age: 25}
%{age: 25, name: "Alice"}
iex> PipeOveruse.get_name_bad(user)
"Alice"
iex> PipeOveruse.get_name_good(user)
"Alice"
iex> user_no_name = %{age: 30}
%{age: 30}
iex> PipeOveruse.get_name_bad(user_no_name)
nil
iex> PipeOveruse.get_name_good(user_no_name)
nil
iex> valid_data = %{id: 1, valid: true}
%{id: 1, valid: true}
iex> PipeOveruse.complex_logic_bad(valid_data)
{:ok, %{finalized: true, id: 1, processed: true, valid: true}}
iex> PipeOveruse.complex_logic_good(valid_data)
{:ok, %{finalized: true, id: 1, processed: true, valid: true}}
iex> invalid_data = %{id: 2, valid: false}
%{id: 2, valid: false}
iex> PipeOveruse.complex_logic_bad(invalid_data)
{:error, "Invalid data"}
iex> PipeOveruse.complex_logic_good(invalid_data)
{:error, "Invalid data"}
iex> simple_data = %{name: "test"}
%{name: "test"}
iex> PipeOveruse.complex_logic_good(simple_data)
{:ok, %{finalized: true, name: "test", processed: true}}
Pitfall 4: Parentheses and Precedence
Be careful with function calls that don't use parentheses:
defmodule ParenthesesPitfall do
# Problem: Inconsistent style without parentheses
def calculate_wrong(value) do
value
|> Integer.to_string
|> String.upcase # Works but inconsistent style
end
# Better: Always use parentheses in pipelines for clarity
def calculate_good(value) do
value
|> Integer.to_string()
|> then(fn str -> "Value: " <> str end)
end
# Alternative solution with helper function
def calculate_alternative(value) do
value
|> Integer.to_string()
|> prepend("Value: ")
end
# Problem: Complex expressions without clear grouping
def complex_without_parens(value) do
value
|> Integer.to_string
|> String.length
|> add_ten
end
# Better: Clear function calls with parentheses
def complex_with_parens(value) do
value
|> Integer.to_string()
|> String.length()
|> add_ten()
end
# Problem: Mixing operators and pipes can be confusing
def confusing_mix(values) when is_list(values) do
values
|> length
|> add_ten
end
# Better: Clear intent with parentheses
def clear_intent(values) when is_list(values) do
values
|> length()
|> add_ten()
end
defp prepend(string, prefix), do: prefix <> string
defp add_ten(value), do: value + 10
end
Testing in IEx:
iex> ParenthesesPitfall.calculate_wrong(42)
"42"
iex> ParenthesesPitfall.calculate_good(42)
"Value: 42"
iex> ParenthesesPitfall.calculate_alternative(42)
"Value: 42"
iex> ParenthesesPitfall.complex_without_parens(1234)
14
iex> ParenthesesPitfall.complex_with_parens(1234)
14
iex> ParenthesesPitfall.confusing_mix([1, 2, 3, 4, 5])
15
iex> ParenthesesPitfall.clear_intent([1, 2, 3, 4, 5])
15
iex> ParenthesesPitfall.calculate_alternative(999)
"Value: 999"
iex> large_list = Enum.to_list(1..100)
[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, ...]
iex> ParenthesesPitfall.clear_intent(large_list)
110
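The precedence issue is easiest to see on the left-hand side of a pipe: without parentheses, the first call's argument swallows the rest of the expression. Here is a small sketch of the classic gotcha:
# This expression:
String.graphemes "Hello" |> Enum.reverse()
# is parsed as:
String.graphemes("Hello" |> Enum.reverse())
# which fails, because Enum.reverse/1 receives the binary "Hello" and the
# Enumerable protocol is not implemented for binaries.
# Explicit parentheses remove the ambiguity:
String.graphemes("Hello") |> Enum.reverse()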
Advanced Techniques
Custom Pipe-Friendly Functions
Design your functions to work well with the pipe operator:
defmodule PipeFriendly do
# Design functions with the "subject" as the first parameter
def update_user_name(user, name) do
Map.put(user, :name, name)
end
def add_role(user, role) do
Map.update(user, :roles, [role], &[role | &1])
end
def set_status(user, status) do
Map.put(user, :status, status)
end
# Now they work beautifully in pipelines
def create_admin_user(name, email) do
%{email: email}
|> update_user_name(name)
|> add_role("admin")
|> add_role("user")
|> set_status("active")
end
end
Testing in IEx:
iex> PipeFriendly.create_admin_user("Alice", "alice@example.com")
%{
email: "alice@example.com",
name: "Alice",
roles: ["user", "admin"],
status: "active"
}
The then Function
Elixir 1.12+ introduced the then function for cases where you need more flexibility:
defmodule ThenExamples do
def calculate_statistics(numbers) do
numbers
|> Enum.sort()
|> then(fn sorted ->
%{
min: List.first(sorted),
max: List.last(sorted),
median: median(sorted),
mean: mean(numbers)
}
end)
end
defp median(sorted_list) do
len = length(sorted_list)
if rem(len, 2) == 1 do
Enum.at(sorted_list, div(len, 2))
else
mid = div(len, 2)
(Enum.at(sorted_list, mid - 1) + Enum.at(sorted_list, mid)) / 2
end
end
defp mean(list) do
Enum.sum(list) / length(list)
end
# Using then for conditional transformations
def process_optional_field(data, field, transformer \\ & &1) do
data
|> Map.get(field)
|> then(fn
nil -> nil
value -> transformer.(value)
end)
end
end
Testing in IEx:
iex> ThenExamples.calculate_statistics([5, 2, 8, 1, 9, 3])
%{max: 9, mean: 4.666666666666667, median: 4.0, min: 1}
iex> data = %{name: "john doe", age: "25"}
%{age: "25", name: "john doe"}
iex> ThenExamples.process_optional_field(data, :name, &String.upcase/1)
"JOHN DOE"
iex> ThenExamples.process_optional_field(data, :missing, &String.upcase/1)
nil
Tap for Side Effects
The tap function allows you to perform side effects without breaking the pipeline:
defmodule TapExamples do
require Logger
def process_with_logging(data) do
data
|> validate()
|> tap(&log_step("Validation complete", &1))
|> transform()
|> tap(&log_step("Transformation complete", &1))
|> enrich()
|> tap(&save_checkpoint/1)
|> format_output()
end
defp validate(data), do: Map.put(data, :validated, true)
defp transform(data), do: Map.put(data, :transformed, true)
defp enrich(data), do: Map.put(data, :enriched, true)
defp format_output(data), do: {:ok, data}
defp log_step(message, data) do
IO.puts("#{message}: #{inspect(Map.keys(data))}")
end
defp save_checkpoint(data) do
# Simulate saving a checkpoint
IO.puts("Checkpoint saved with keys: #{inspect(Map.keys(data))}")
end
end
Testing in IEx:
iex> TapExamples.process_with_logging(%{id: 1, name: "test"})
Validation complete: [:id, :name, :validated]
Transformation complete: [:id, :name, :transformed, :validated]
Checkpoint saved with keys: [:enriched, :id, :name, :transformed, :validated]
{:ok,
%{enriched: true, id: 1, name: "test", transformed: true, validated: true}}
Real-World Examples
API Request Processing
defmodule ApiProcessor do
def handle_request(raw_request) do
raw_request
|> parse_request()
|> authenticate()
|> authorize()
|> validate_params()
|> execute_action()
|> format_response()
|> add_headers()
end
defp parse_request(raw) do
%{
method: raw[:method],
path: raw[:path],
params: raw[:params] || %{},
headers: raw[:headers] || %{},
user_token: get_in(raw, [:headers, "Authorization"])
}
end
defp authenticate(%{user_token: nil} = request) do
{:error, :unauthorized, request}
end
defp authenticate(%{user_token: token} = request) do
# Simulate token validation
if String.starts_with?(token, "valid_") do
{:ok, Map.put(request, :user_id, String.slice(token, 6..-1//1))}
else
{:error, :unauthorized, request}
end
end
defp authorize({:error, _, _} = error), do: error
defp authorize({:ok, %{path: path, user_id: user_id} = request}) do
# Simulate authorization check
if can_access?(user_id, path) do
{:ok, request}
else
{:error, :forbidden, request}
end
end
defp can_access?(_user_id, "/public" <> _), do: true
defp can_access?("admin", _), do: true
defp can_access?(_, _), do: false
defp validate_params({:error, _, _} = error), do: error
defp validate_params({:ok, %{params: params} = request}) do
if valid_params?(params) do
{:ok, request}
else
{:error, :bad_request, request}
end
end
defp valid_params?(params) do
Map.keys(params) |> Enum.all?(&is_binary/1)
end
defp execute_action({:error, _, _} = error), do: error
defp execute_action({:ok, %{method: "GET", path: path} = request}) do
{:ok, %{status: 200, body: "GET #{path} successful", request: request}}
end
defp execute_action({:ok, %{method: method} = request}) do
{:ok, %{status: 200, body: "#{method} successful", request: request}}
end
defp format_response({:error, reason, _request}) do
%{
status: status_code(reason),
body: %{error: to_string(reason)}
}
end
defp format_response({:ok, response}) do
%{
status: response.status,
body: response.body
}
end
defp status_code(:unauthorized), do: 401
defp status_code(:forbidden), do: 403
defp status_code(:bad_request), do: 400
defp status_code(_), do: 500
defp add_headers(response) do
Map.put(response, :headers, %{
"Content-Type" => "application/json",
"X-Request-ID" => generate_request_id()
})
end
defp generate_request_id do
:rand.uniform(1_000_000) |> Integer.to_string()
end
end
Testing in IEx:
iex> request = %{
method: "GET",
path: "/api/users",
headers: %{"Authorization" => "valid_admin"},
params: %{"page" => "1"}
}
%{
headers: %{"Authorization" => "valid_admin"},
method: "GET",
params: %{"page" => "1"},
path: "/api/users"
}
iex> ApiProcessor.handle_request(request)
%{
body: "GET /api/users successful",
headers: %{"Content-Type" => "application/json", "X-Request-ID" => "881694"},
status: 200
}
iex> bad_request = %{method: "POST", path: "/admin", headers: %{"Authorization" => "valid_user"}}
%{headers: %{"Authorization" => "valid_user"}, method: "POST", path: "/admin"}
iex> ApiProcessor.handle_request(bad_request)
%{
body: %{error: "forbidden"},
headers: %{"Content-Type" => "application/json", "X-Request-ID" => "931314"},
status: 403
}
Best Practices
1. Keep Pipelines Readable
Aim for pipelines that tell a story:
# Good: Clear transformation steps
def process_order(order) do
order
|> validate_items()
|> calculate_subtotal()
|> apply_discounts()
|> add_taxes()
|> calculate_total()
end
# Bad: Unclear purpose
def process(data) do
data
|> f1()
|> f2()
|> f3()
|> f4()
end
2. Limit Pipeline Length
Break long pipelines into smaller, named functions:
# Too long
def process_everything(data) do
data
|> step1()
|> step2()
|> step3()
|> step4()
|> step5()
|> step6()
|> step7()
|> step8()
|> step9()
|> step10()
end
# Better: Group related operations
def process_everything(data) do
data
|> prepare_data()
|> process_data()
|> finalize_data()
end
defp prepare_data(data) do
data
|> step1()
|> step2()
|> step3()
end
defp process_data(data) do
data
|> step4()
|> step5()
|> step6()
|> step7()
end
defp finalize_data(data) do
data
|> step8()
|> step9()
|> step10()
end
3. Design Functions for Piping
Make the "subject" of the operation the first parameter:
# Good: User is the subject, first parameter
def update_user_email(user, email) do
%{user | email: email}
end
def add_user_role(user, role) do
%{user | roles: [role | user.roles]}
end
# Usage in pipeline
user
|> update_user_email("new@example.com")
|> add_user_role("admin")
# Bad: Subject is not first
def update_email(email, user) do
%{user | email: email}
end
# Awkward in pipeline
user
|> then(&update_email("new@example.com", &1))
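The standard library follows this convention throughout: String, Enum, and Map functions all take their "subject" (the string, the enumerable, the map) as the first argument, which is exactly why they pipe so naturally:
"  jane doe  "
|> String.trim()
|> String.split(" ")
|> Enum.map(&String.capitalize/1)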
4. Use Consistent Return Values
Make functions pipeline-friendly with consistent returns:
# Good: Consistent {:ok, result} or {:error, reason}
def validate_age(user) do
if user.age >= 18 do
{:ok, user}
else
{:error, "User must be 18 or older"}
end
end
def validate_email(user) do
if String.contains?(user.email, "@") do
{:ok, user}
else
{:error, "Invalid email format"}
end
end
# Bad: Inconsistent returns
def validate_age(user) do
if user.age >= 18, do: user, else: nil
end
def validate_email(user) do
if String.contains?(user.email, "@") do
true
else
{:error, "Invalid email"}
end
end
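Consistent {:ok, _} / {:error, _} tuples pay off as soon as you chain the checks together, for example with with. A minimal sketch using the tuple-returning validators above (validate_user is a hypothetical wrapper):
def validate_user(user) do
with {:ok, user} <- validate_age(user),
{:ok, user} <- validate_email(user) do
{:ok, user}
end
end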
5. Handle Errors Gracefully
Design pipelines that can handle errors elegantly:
defmodule ErrorHandling do
def safe_pipeline(data) do
data
|> validate()
|> continue_if_ok(&transform/1)
|> continue_if_ok(&persist/1)
|> continue_if_ok(&notify/1)
end
defp continue_if_ok({:error, _} = error, _fun), do: error
defp continue_if_ok({:ok, data}, fun), do: fun.(data)
defp validate(data) do
if valid?(data), do: {:ok, data}, else: {:error, "Invalid data"}
end
defp transform(data), do: {:ok, Map.put(data, :transformed, true)}
defp persist(data), do: {:ok, Map.put(data, :persisted, true)}
defp notify(data), do: {:ok, Map.put(data, :notified, true)}
defp valid?(data), do: is_map(data)
end
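Testing in IEx (the happy path threads {:ok, _} through every step, while invalid input short-circuits at the first check):
iex> ErrorHandling.safe_pipeline(%{name: "Alice"})
{:ok, %{name: "Alice", notified: true, persisted: true, transformed: true}}
iex> ErrorHandling.safe_pipeline("not a map")
{:error, "Invalid data"}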
Conclusion
The pipe operator is a cornerstone of idiomatic Elixir code, transforming complex nested function calls into clear, linear transformations. By understanding its mechanics and following best practices, you can write code that is not only functional but also highly readable and maintainable.
Key takeaways from this article include:
- The pipe operator |> passes the result of the left expression as the first argument to the function on the right
- Pipelines make data transformations explicit and easy to follow
- Design your functions with the "subject" as the first parameter to make them pipe-friendly
- Use then and tap for more complex pipeline scenarios
- Avoid common pitfalls like wrong argument positioning and overuse
- Break long pipelines into smaller, focused functions
- Maintain consistent return values for better composability
- The pipe operator works beautifully with pattern matching and other Elixir features
By mastering the pipe operator, you'll write Elixir code that reads like a clear description of data transformation, making your programs more maintainable and your intent more obvious.
Tip: When refactoring nested function calls, start from the innermost function and work your way out, creating a pipeline that flows from left to right. This often reveals the natural sequence of transformations and can highlight opportunities for extracting reusable functions.
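For example, a nested expression that title-cases a sentence (assuming a sentence variable bound to a string) unwinds naturally into a pipeline, innermost call first:
# Nested: read from the inside out
Enum.join(Enum.map(String.split(sentence), &String.capitalize/1), " ")
# Piped: the same calls, innermost first, flowing left to right
sentence
|> String.split()
|> Enum.map(&String.capitalize/1)
|> Enum.join(" ")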
Further Reading
- Elixir Documentation - Pipe Operator
- Elixir School - Pipe Operator
- Elixir Getting Started Guide - Pipe Operator
Next Steps
In the upcoming article, we'll explore Function Composition:
Function Composition
- Understanding function composition in functional programming
- Creating higher-order functions for composition
- Building composable function libraries
- Partial application and currying in Elixir
- Advanced composition patterns and techniques
- Combining composition with pipes for powerful abstractions
Function composition takes the concepts we've learned with the pipe operator to the next level, allowing us to build complex functionality from simple, reusable building blocks. We'll explore how to create functions that combine other functions, leading to more abstract and powerful programming patterns.