
Assignment Operators in Python

Python operators are used to perform operations on values and variables. They are special symbols that carry out arithmetic, logical, and bitwise computations. The value an operator operates on is known as an operand. Here, we will cover the different assignment operators in Python.

Operators

Operator    Description                                                                              Example
=           Assigns the value of the right-side expression to the left-side operand                 c = a + b
+=          Adds the right operand to the left operand and assigns the result to the left operand   a += b
-=          Subtracts the right operand from the left operand and assigns the result to the left operand   a -= b
*=          Multiplies the left operand by the right operand and assigns the result to the left operand    a *= b
/=          Divides the left operand by the right operand and assigns the result to the left operand       a /= b
%=          Divides the left operand by the right operand and assigns the remainder to the left operand    a %= b
//=         Floor-divides the left operand by the right operand and assigns the result to the left operand a //= b
**=         Raises the left operand to the power of the right operand and assigns the result to the left operand   a **= b
&=          Performs bitwise AND on the operands and assigns the result to the left operand         a &= b
|=          Performs bitwise OR on the operands and assigns the result to the left operand          a |= b
^=          Performs bitwise XOR on the operands and assigns the result to the left operand         a ^= b
>>=         Performs a bitwise right shift on the operands and assigns the result to the left operand      a >>= b
<<=         Performs a bitwise left shift on the operands and assigns the result to the left operand       a <<= b
:=          Assigns a value to a variable within an expression (walrus operator)                    a := exp

Here are the Assignment Operators in Python with examples.

Assignment Operator

Assignment operators are used to assign values to variables. The basic assignment operator (=) assigns the value of the expression on the right side to the operand on the left side.

Addition Assignment Operator

The Addition Assignment Operator is used to add the right-hand side operand to the left-hand side operand and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the addition assignment operator which will first perform the addition operation and then assign the result to the variable on the left-hand side.
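
For instance, assuming a = 5 and b = 3:

    a = 5
    b = 3
    a += b        # same as a = a + b
    print(a)      # 8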

Subtraction Assignment Operator

The Subtraction Assignment Operator is used to subtract the right-hand side operand from the left-hand side operand and then assign the result to the left-hand side operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the subtraction assignment operator which will first perform the subtraction operation and then assign the result to the variable on the left-hand side.
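
For instance, assuming a = 10 and b = 3:

    a = 10
    b = 3
    a -= b        # same as a = a - b
    print(a)      # 7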

Multiplication Assignment Operator

The Multiplication Assignment Operator is used to multiply the right-hand side operand with the left-hand side operand and then assign the result to the left-hand side operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the multiplication assignment operator which will first perform the multiplication operation and then assign the result to the variable on the left-hand side.
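
For instance, assuming a = 4 and b = 3:

    a = 4
    b = 3
    a *= b        # same as a = a * b
    print(a)      # 12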

Division Assignment Operator

The Division Assignment Operator is used to divide the left-hand side operand by the right-hand side operand and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the division assignment operator which will first perform the division operation and then assign the result to the variable on the left-hand side.
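
For instance, assuming a = 10 and b = 4:

    a = 10
    b = 4
    a /= b        # same as a = a / b; true division always gives a float
    print(a)      # 2.5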

Modulus Assignment Operator

The Modulus Assignment Operator is used to take the modulus, that is, it first divides the operands and then takes the remainder and assigns it to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the modulus assignment operator which will first perform the modulus operation and then assign the result to the variable on the left-hand side.
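
For instance, assuming a = 10 and b = 3:

    a = 10
    b = 3
    a %= b        # same as a = a % b (remainder of 10 divided by 3)
    print(a)      # 1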

Floor Division Assignment Operator

The Floor Division Assignment Operator is used to divide the left operand by the right operand and then assign the result (floor value) to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the floor division assignment operator which will first perform the floor division operation and then assign the result to the variable on the left-hand side.
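
For instance, assuming a = 10 and b = 3:

    a = 10
    b = 3
    a //= b       # same as a = a // b
    print(a)      # 3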

Exponentiation Assignment Operator

The Exponentiation Assignment Operator is used to raise the left operand to the power of the right operand and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the exponentiation assignment operator which will first perform exponent operation and then assign the result to the variable on the left-hand side.
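
For instance, assuming a = 2 and b = 5:

    a = 2
    b = 5
    a **= b       # same as a = a ** b
    print(a)      # 32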

Bitwise AND Assignment Operator

The Bitwise AND Assignment Operator is used to perform a Bitwise AND operation on both operands and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the bitwise AND assignment operator which will first perform Bitwise AND operation and then assign the result to the variable on the left-hand side.
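
For instance, assuming a = 12 and b = 10:

    a = 12
    b = 10
    a &= b        # 0b1100 & 0b1010 == 0b1000
    print(a)      # 8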

Bitwise OR Assignment Operator

The Bitwise OR Assignment Operator is used to perform a Bitwise OR operation on the operands and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the bitwise OR assignment operator which will first perform bitwise OR operation and then assign the result to the variable on the left-hand side.
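
For instance, assuming a = 12 and b = 10:

    a = 12
    b = 10
    a |= b        # 0b1100 | 0b1010 == 0b1110
    print(a)      # 14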

Bitwise XOR Assignment Operator 

The Bitwise XOR Assignment Operator is used to perform a Bitwise XOR operation on the operands and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the bitwise XOR assignment operator which will first perform bitwise XOR operation and then assign the result to the variable on the left-hand side.
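
For instance, assuming a = 12 and b = 10:

    a = 12
    b = 10
    a ^= b        # 0b1100 ^ 0b1010 == 0b0110
    print(a)      # 6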

Bitwise Right Shift Assignment Operator

The Bitwise Right Shift Assignment Operator is used to perform a Bitwise Right Shift operation on the operands and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the bitwise right shift assignment operator which will first perform bitwise right shift operation and then assign the result to the variable on the left-hand side.
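
For instance, assuming a = 16 and b = 2:

    a = 16
    b = 2
    a >>= b       # shift the bits of 16 (0b10000) right by 2 places
    print(a)      # 4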

Bitwise Left Shift Assignment Operator

The Bitwise Left Shift Assignment Operator is used to perform a Bitwise Left Shift operation on the operands and then assign the result to the left operand.

Example: In this code we have two variables ‘a’ and ‘b’ and assigned them with some integer value. Then we have used the bitwise left shift assignment operator which will first perform bitwise left shift operation and then assign the result to the variable on the left-hand side.
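
For instance, assuming a = 3 and b = 2:

    a = 3
    b = 2
    a <<= b       # shift the bits of 3 (0b11) left by 2 places
    print(a)      # 12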

Walrus Operator

The Walrus Operator in Python is an assignment operator introduced in Python 3.8. It is used to assign a value to a variable within an expression.

Example: In this code, we have a Python list of integers. We have used the walrus assignment operator within a while loop. The operator evaluates the expression on the right-hand side, assigns the value to the left-hand side operand 'x', and then the remaining code executes.
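
A minimal sketch of the described example (the list contents are assumed):

    numbers = [1, 2, 3, 4, 5]          # assumed sample list of integers
    while (x := len(numbers)) > 0:      # x is assigned inside the loop condition
        print(numbers.pop())            # prints 5, 4, 3, 2, 1, one per line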

Assignment Operators in Python – FAQs

What are assignment operators in Python?

Assignment operators in Python are used to assign values to variables. These operators can also perform additional operations during the assignment. The basic assignment operator is = , which simply assigns the value of the right-hand operand to the left-hand operand. Other common assignment operators include += , -= , *= , /= , %= , and more, which perform an operation on the variable and then assign the result back to the variable.

What is the := Operator in Python?

The := operator, introduced in Python 3.8, is known as the “walrus operator”. It is an assignment expression, which means that it assigns values to variables as part of a larger expression. Its main benefit is that it allows you to assign values to variables within expressions, including within conditions of loops and if statements, thereby reducing the need for additional lines of code. Here’s an example:

    # Example of using the walrus operator in a while loop
    while (n := int(input("Enter a number (0 to stop): "))) != 0:
        print(f"You entered: {n}")

This loop continues to prompt the user for input and immediately uses that input in both the condition check and the loop body.

What is the Assignment Operator in Structure?

In programming languages that use structures (like C or C++), the assignment operator = is used to copy values from one structure variable to another. Each member of the structure is copied from the source structure to the destination structure. Python, however, does not have a built-in concept of ‘structures’ as in C or C++; instead, similar functionality is achieved through classes or dictionaries.

What is the Assignment Operator in Python Dictionary?

In Python dictionaries, the assignment operator = is used to assign a new key-value pair to the dictionary or update the value of an existing key. Here’s how you might use it:

    my_dict = {}                       # Create an empty dictionary
    my_dict['key1'] = 'value1'         # Assign a new key-value pair
    my_dict['key1'] = 'updated value'  # Update the value of an existing key
    print(my_dict)                     # Output: {'key1': 'updated value'}

What is += and -= in Python?

The += and -= operators in Python are compound assignment operators. += adds the right-hand operand to the left-hand operand and assigns the result to the left-hand operand. Conversely, -= subtracts the right-hand operand from the left-hand operand and assigns the result to the left-hand operand. Here are examples of both:

    # Example of using +=
    a = 5
    a += 3    # Equivalent to a = a + 3
    print(a)  # Output: 8

    # Example of using -=
    b = 10
    b -= 4    # Equivalent to b = b - 4
    print(b)  # Output: 6

These operators make code more concise and are commonly used in loops and iterative data processing.


Python Assignment Operators

Assignment operators are used to assign values to variables:

Operator Example Same As
= x = 5 x = 5
+= x += 3 x = x + 3
-= x -= 3 x = x - 3
*= x *= 3 x = x * 3
/= x /= 3 x = x / 3
%= x %= 3 x = x % 3
//= x //= 3 x = x // 3
**= x **= 3 x = x ** 3
&= x &= 3 x = x & 3
|= x |= 3 x = x | 3
^= x ^= 3 x = x ^ 3
>>= x >>= 3 x = x >> 3
<<= x <<= 3 x = x << 3



Python Assignment Operator

The = (equal to) symbol is defined as the assignment operator in Python. The value of the Python expression on its right is assigned to a single variable on its left. The = symbol, as used in programming in general (and Python in particular), should not be confused with its usage in mathematics, where it states that the expressions on either side of the symbol are equal.

Example of Assignment Operator in Python

Consider the following Python statements −
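
A minimal sketch of the statements being discussed (a = 10 and b = 5 are assumed values, chosen so that the sum is 15 as described below):

    a = 10
    b = 5
    a = a + b
    print(a)      # 15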

At the first instance, at least for somebody new to programming but who knows maths, the statement "a=a+b" looks strange. How could a be equal to "a+b"? However, it needs to be reemphasized that the = symbol is an assignment operator here and not used to show the equality of LHS and RHS.

Because it is an assignment, the expression on the right evaluates to 15 first, and that value is then assigned to a.

In the statement "a+=b", the two operators "+" and "=" are combined into a single "+=" operator. It is called the add and assign operator. In a single statement, it performs the addition of the two operands "a" and "b", and the result is assigned to the operand on the left, i.e., "a".

Augmented Assignment Operators in Python

In addition to the simple assignment operator, Python provides a few more assignment operators for advanced use. They are called cumulative or augmented assignment operators. In this chapter, we shall learn to use the augmented assignment operators defined in Python.

Python has augmented assignment operators for all arithmetic and bitwise operators.

An augmented assignment operator combines an operation and an assignment in one statement. Since Python supports mixed arithmetic, the two operands may be of different types; in that case, the result (and hence the left operand) takes the wider of the two types.

The += operator is an augmented operator. It is also called the cumulative addition operator, as it adds "b" to "a" and assigns the result back to the variable "a".

The following are the augmented assignment operators in Python:

  • Augmented Addition Operator
  • Augmented Subtraction Operator
  • Augmented Multiplication Operator
  • Augmented Division Operator
  • Augmented Modulus Operator
  • Augmented Exponent Operator
  • Augmented Floor division Operator

Augmented Addition Operator (+=)

The following example will help in understanding how the "+=" operator works and the output it produces −
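
A minimal sketch with assumed values (one int and one float operand, showing the mixed-arithmetic behaviour mentioned above):

    a = 10
    b = 5.5
    a += b
    print(a)      # 15.5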

Augmented Subtraction Operator (-=)

Use -= symbol to perform subtract and assign operations in a single statement. The "a-=b" statement performs "a=a-b" assignment. Operands may be of any number type. Python performs implicit type casting on the object which is narrower in size.

Augmented Multiplication Operator (*=)

The "*=" operator works on similar principle. "a*=b" performs multiply and assign operations, and is equivalent to "a=a*b". In case of augmented multiplication of two complex numbers, the rule of multiplication as discussed in the previous chapter is applicable.

Augmented Division Operator (/=)

The combination symbol "/=" acts as divide and assignment operator, hence "a/=b" is equivalent to "a=a/b". The division operation of int or float operands is float. Division of two complex numbers returns a complex number. Given below are examples of augmented division operator.
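
For instance (operand values assumed):

    a = 10
    a /= 4
    print(a)      # 2.5  (int or float operands give a float result)

    b = 6 + 4j
    b /= 2
    print(b)      # (3+2j)  (with a complex operand the result is complex)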

Augmented Modulus Operator (%=)

To perform modulus and assignment operation in a single statement, use the %= operator. Like the mod operator, its augmented version also is not supported for complex number.

Augmented Exponent Operator (**=)

The "**=" operator results in computation of "a" raised to "b", and assigning the value back to "a". Given below are some examples −

Augmented Floor division Operator (//=)

For performing floor division and assignment in a single statement, use the "//=" operator. "a//=b" is equivalent to "a=a//b". This operator cannot be used with complex numbers.

Assignment Operators

Operator    Description
+=          Adds a value to the variable and assigns the result to it (add and assign)
-=          Subtracts a value from the variable and assigns the result to it (subtract and assign)
*=          Multiplies the variable by a value and assigns the result to it (multiply and assign)
/=          Divides the variable by a value and assigns the result to it; the result is always a float (divide and assign)
//=         Floor-divides the variable by a value and assigns the result to it; the result type depends on the types of the values used (floor divide and assign)
**=         Raises the variable to the power of a value and assigns the result to it (exponent and assign)
%=          Divides the variable by a value and assigns the remainder to it (modulo and assign)

For demonstration purposes, let’s use a single variable, num . Initially, we set num to 6. We can apply all of these operators to num and update it accordingly.

Assigning the value of 6 to num results in num being 6.

Expression: num = 6

Adding 3 to num and assigning the result back to num would result in 9.

Expression: num += 3

Subtracting 3 from num and assigning the result back to num would result in 6.

Expression: num -= 3

Multiplying num by 3 and assigning the result back to num would result in 18.

Expression: num *= 3

Dividing num by 3 and assigning the result back to num would result in 6.0 (always a float).

Expression: num /= 3

Performing floor division on num by 3 and assigning the result back to num would result in 2.0 (the value stays a float after the division above).

Expression: num //= 3

Raising num to the power of 3 and assigning the result back to num would result in 8.0 (2.0 raised to the power of 3).

Expression: num **= 3

Calculating the remainder when num is divided by 3 and assigning the result back to num would result in 2.0.

Expression: num %= 3

We can effectively put this whole sequence into Python code and experiment with it:
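
    num = 6
    num += 3      # 9
    num -= 3      # 6
    num *= 3      # 18
    num /= 3      # 6.0  (true division always yields a float)
    num //= 3     # 2.0
    num **= 3     # 8.0
    num %= 3      # 2.0
    print(num)    # 2.0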

The above code is useful when we want to keep updating the same variable. We can also use two different variables and apply the assignment operators to them.

Python Operators: Arithmetic, Assignment, Comparison, Logical, Identity, Membership, Bitwise

Operators are special symbols that perform some operation on operands and return the result. For example, 5 + 6 is an expression where + is an operator that performs an arithmetic add operation on the numeric left operand 5 and the right operand 6 and returns the sum of the two operands as the result.

Python includes the operator module that includes underlying methods for each operator. For example, the + operator calls the operator.add(a,b) method.
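
For instance:

    >>> import operator
    >>> 5 + 6
    11
    >>> operator.add(5, 6)
    11
    >>> operator.__add__(5, 6)
    11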

Above, expression 5 + 6 is equivalent to the expression operator.add(5, 6) and operator.__add__(5, 6) . Many function names are those used for special methods, without the double underscores (dunder methods). For backward compatibility, many of these have functions with the double underscores kept.

Python includes the following categories of operators:

  • Arithmetic operators
  • Assignment operators
  • Comparison operators
  • Logical operators
  • Identity operators
  • Membership test operators
  • Bitwise operators

Arithmetic Operators

Arithmetic operators perform the common mathematical operations on numeric operands.

The type of the result returned by an arithmetic operator depends on the types of the operands, as described below.

  • If either operand is a complex number, the result is converted to complex;
  • If either operand is a floating point number, the result is converted to floating point;
  • If both operands are integers, then the result is an integer and no conversion is needed.

The following table lists all the arithmetic operators in Python:

Operation                                   Operator  Function
Sum of two operands                         +         operator.add(a,b)
Left operand minus right operand            -         operator.sub(a,b)
Multiplication of two operands              *         operator.mul(a,b)
Left operand raised to the power of right   **        operator.pow(a,b)
Division (the result is a float)            /         operator.truediv(a,b)
Floor division of left operand by right     //        operator.floordiv(a,b)
Remainder of left operand divided by right  %         operator.mod(a, b)

Assignment Operators

The assignment operators are used to assign values to variables. The following table lists all the assignment operators in Python:

Operator Function
= (plain assignment; no corresponding operator function)
+= operator.iadd(a,b)
-= operator.isub(a,b)
*= operator.imul(a,b)
/= operator.itruediv(a,b)
//= operator.ifloordiv(a,b)
%= operator.imod(a, b)
&= operator.iand(a, b)
|= operator.ior(a, b)
^= operator.ixor(a, b)
>>= operator.irshift(a, b)
<<= operator.ilshift(a, b)

Comparison Operators

The comparison operators compare two operands and return a boolean, either True or False. The following table lists the comparison operators in Python.

Operator Function Description
> operator.gt(a,b) True if the left operand is higher than the right one
< operator.lt(a,b) True if the left operand is lower than right one
== operator.eq(a,b) True if the operands are equal
!= operator.ne(a,b) True if the operands are not equal
>= operator.ge(a,b) True if the left operand is higher than or equal to the right one
<= operator.le(a,b) True if the left operand is lower than or equal to the right one

Logical Operators

The logical operators are used to combine two boolean expressions. The logical operations are generally applicable to all objects, and support truth tests, identity tests, and boolean operations.

Operator Description
and True if both are true
or True if at least one is true
not Returns True if an expression evaluates to false, and vice-versa

Identity Operators

The identity operators check whether two objects have the same id value, i.e., whether both operands point to the same object in memory.

Operator Function Description
is operator.is_(a,b) True if both operands refer to the same object
is not operator.is_not(a,b) True if the operands refer to different objects

Membership Test Operators

The membership test operators in and not in test whether a sequence contains a given item. For the string and bytes types, x in y is True if and only if x is a substring of y .

Operator Function Description
in operator.contains(a,b) Returns True if the sequence contains the specified item, else returns False.
not in not operator.contains(a,b) Returns True if the sequence does not contain the specified item, else returns False.

Bitwise Operators

Bitwise operators perform operations on the binary representations of integer operands.

Operator Function Description
& operator.and_(a,b) Sets each bit to 1 if both bits are 1.
| operator.or_(a,b) Sets each bit to 1 if one of two bits is 1.
^ operator.xor(a,b) Sets each bit to 1 if only one of two bits is 1.
~ operator.invert(a) Inverts all the bits.
<< operator.lshift(a,b) Shift left by pushing zeros in from the right and let the leftmost bits fall off.
>> operator.rshift(a,b) Shift right by pushing copies of the leftmost bit in from the left, and let the rightmost bits fall off.



Assignment Operator in Python


To fully comprehend the assignment operators in Python, it is important to have a basic understanding of what operators are. Operators are utilized to carry out a variety of operations, including mathematical, bitwise, and logical operations, among others, by connecting operands. Operands are the values that are acted upon by operators. In Python, the assignment operator is used to assign a value to a variable. The assignment operator is represented by the equals sign (=), and it is the most commonly used operator in Python. In this article, we will explore the assignment operator in Python, how it works, and its different types.

What is an Assignment Operator in Python?

The assignment operator in Python is represented by the equals sign (=) and is used to assign a value to a variable. When an assignment operator is used, the value on the right-hand side is assigned to the variable on the left-hand side. This is a fundamental operation in programming, as it allows developers to store data in variables that can be used throughout their code.

For example, consider the following line of code:

Explanation: In this case, the value 10 is assigned to the variable a using the assignment operator. The variable a now holds the value 10, and this value can be used in other parts of the code. This simple example illustrates the basic usage and importance of assignment operators in Python programming.

Types of Assignment Operator in Python

There are several types of assignment operator in Python that are used to perform different operations. Let’s explore each type of assignment operator in Python in detail with the help of some code examples.

1. Simple Assignment Operator (=)

The simple assignment operator is the most commonly used operator in Python. It is used to assign a value to a variable. The syntax for the simple assignment operator is:

Here, the value on the right-hand side of the equals sign is assigned to the variable on the left-hand side. For example

Explanation: In this case, the value 25 is assigned to the variable a using the simple assignment operator. The variable a now holds the value 25.

2. Addition Assignment Operator (+=)

The addition assignment operator is used to add a value to a variable and store the result in the same variable. The syntax for the addition assignment operator is:

Here, the value on the right-hand side is added to the variable on the left-hand side, and the result is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is incremented by 5 using the addition assignment operator. The result, 15, is then printed to the console.

3. Subtraction Assignment Operator (-=)

The subtraction assignment operator is used to subtract a value from a variable and store the result in the same variable. The syntax for the subtraction assignment operator is

Here, the value on the right-hand side is subtracted from the variable on the left-hand side, and the result is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is decremented by 5 using the subtraction assignment operator. The result, 5, is then printed to the console.

4. Multiplication Assignment Operator (*=)

The multiplication assignment operator is used to multiply a variable by a value and store the result in the same variable. The syntax for the multiplication assignment operator is:

Here, the value on the right-hand side is multiplied by the variable on the left-hand side, and the result is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is multiplied by 5 using the multiplication assignment operator. The result, 50, is then printed to the console.

5. Division Assignment Operator (/=)

The division assignment operator is used to divide a variable by a value and store the result in the same variable. The syntax for the division assignment operator is:

Here, the variable on the left-hand side is divided by the value on the right-hand side, and the result is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is divided by 5 using the division assignment operator. The result, 2.0, is then printed to the console.

6. Modulus Assignment Operator (%=)

The modulus assignment operator is used to find the remainder of the division of a variable by a value and store the result in the same variable. The syntax for the modulus assignment operator is

Here, the variable on the left-hand side is divided by the value on the right-hand side, and the remainder is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is divided by 3 using the modulus assignment operator. The remainder, 1, is then printed to the console.

7. Floor Division Assignment Operator (//=)

The floor division assignment operator is used to divide a variable by a value and round the result down to the nearest integer, and store the result in the same variable. The syntax for the floor division assignment operator is:

Here, the variable on the left-hand side is divided by the value on the right-hand side, and the result is rounded down to the nearest integer. The rounded result is then stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is divided by 3 using the floor division assignment operator. The result, 3, is then printed to the console.

8. Exponentiation Assignment Operator (**=)

The exponentiation assignment operator is used to raise a variable to the power of a value and store the result in the same variable. The syntax for the exponentiation assignment operator is:

Here, the variable on the left-hand side is raised to the power of the value on the right-hand side, and the result is stored back in the variable on the left-hand side. For example

Explanation: In this case, the value of a is raised to the power of 3 using the exponentiation assignment operator. The result, 8, is then printed to the console.

9. Bitwise AND Assignment Operator (&=)

The bitwise AND assignment operator is used to perform a bitwise AND operation on the binary representation of a variable and a value, and store the result in the same variable. The syntax for the bitwise AND assignment operator is:

Here, the variable on the left-hand side is ANDed with the value on the right-hand side using the bitwise AND operator, and the result is stored back in the variable on the left-hand side. For example,

Explanation: In this case, the value of a is ANDed with 3 using the bitwise AND assignment operator. The result, 2, is then printed to the console.

10. Bitwise OR Assignment Operator (|=)

The bitwise OR assignment operator is used to perform a bitwise OR operation on the binary representation of a variable and a value, and store the result in the same variable. The syntax for the bitwise OR assignment operator is:

Here, the variable on the left-hand side is ORed with the value on the right-hand side using the bitwise OR operator, and the result is stored back in the variable on the left-hand side. For example,

Explanation: In this case, the value of a is ORed with 3 using the bitwise OR assignment operator. The result, 7, is then printed to the console.

11. Bitwise XOR Assignment Operator (^=)

The bitwise XOR assignment operator is used to perform a bitwise XOR operation on the binary representation of a variable and a value, and store the result in the same variable. The syntax for the bitwise XOR assignment operator is:

Here, the variable on the left-hand side is XORed with the value on the right-hand side using the bitwise XOR operator, and the result is stored back in the variable on the left-hand side. For example,

Explanation: In this case, the value of a is XORed with 3 using the bitwise XOR assignment operator. The result, 5, is then printed to the console.

12. Bitwise Right Shift Assignment Operator (>>=)

The bitwise right shift assignment operator is used to shift the bits of a variable to the right by a specified number of positions, and store the result in the same variable. The syntax for the bitwise right shift assignment operator is:

Here, the variable on the left-hand side has its bits shifted to the right by the number of positions specified by the value on the right-hand side, and the result is stored back in the variable on the left-hand side. For example,

Explanation: In this case, the value of a is shifted 2 positions to the right using the bitwise right shift assignment operator. The result, 2, is then printed to the console.

13. Bitwise Left Shift Assignment Operator (<<=)

The bitwise left shift assignment operator is used to shift the bits of a variable to the left by a specified number of positions, and store the result in the same variable. The syntax for the bitwise left shift assignment operator is:

Here, the variable on the left-hand side has its bits shifted to the left by the number of positions specified by the value on the right-hand side, and the result is stored back in the variable on the left-hand side. For example,

Conclusion

The assignment operator in Python is used to assign values to variables, and it comes in different types. The simple assignment operator (=) assigns a value to a variable. The augmented arithmetic assignment operators (+=, -=, *=, /=, %=, //=, **=) perform the specified operation and assign the result back to the same variable in one step; for example, the modulus assignment operator (%=) calculates the remainder of a division and assigns it to the same variable. The bitwise assignment operators (&=, |=, ^=, >>=, <<=) perform bitwise operations in the same way; for example, >>= shifts the bits of a variable to the right by a specified number of positions and stores the result in the same variable, while <<= shifts them to the left. These operators are useful for simplifying and shortening code that assigns and manipulates values in a single step.

Here are some Frequently Asked Questions on Assignment Operator in Python:

Q1 – Can I use the assignment operator to assign multiple values to multiple variables at once? Ans – Yes, you can use the assignment operator to assign multiple values to multiple variables at once, separated by commas. For example, "x, y, z = 1, 2, 3" would assign the value 1 to x, 2 to y, and 3 to z.

Q2 – Is it possible to chain assignment operators in Python? Ans – Yes, you can chain assignments in Python to assign the same value to several variables in one line of code. For example, "x = y = z = 1" would assign the value 1 to all three variables.

Q3 – How do I perform a conditional assignment in Python? Ans – To perform a conditional assignment in Python, you can use the ternary (conditional) expression. For example, "x = a if a > b else b" would assign the value of a to x if a is greater than b, otherwise it would assign the value of b to x.

Q4 – What happens if I use an undefined variable in an assignment operation in Python? Ans – If you use an undefined variable in an assignment operation in Python, you will get a NameError. Make sure you have defined the variable before trying to assign a value to it.

Q5 – Can I use assignment operators with non-numeric data types in Python? Ans – Yes, you can use assignment operators with non-numeric data types in Python, such as strings or lists. For example, "my_list += [4, 5, 6]" would append the values 4, 5, and 6 to the end of the list named my_list.


Variables in Python



In the previous tutorial on Basic Data Types in Python , you saw how values of various Python data types can be created. But so far, all the values shown have been literal or constant values:

If you’re writing more complex code, your program will need data that can change as program execution proceeds.

Here’s what you’ll learn in this tutorial: You will learn how every item of data in a Python program can be described by the abstract term object , and you’ll learn how to manipulate objects using symbolic names called variables .


Think of a variable as a name attached to a particular object. In Python, variables need not be declared or defined in advance, as is the case in many other programming languages. To create a variable, you just assign it a value and then start using it. Assignment is done with a single equals sign ( = ):
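
    >>> n = 300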

This is read or interpreted as “ n is assigned the value 300 .” Once this is done, n can be used in a statement or expression, and its value will be substituted:
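
    >>> print(n)
    300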

Just as a literal value can be displayed directly from the interpreter prompt in a REPL session without the need for print() , so can a variable:
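
    >>> n
    300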

Later, if you change the value of n and use it again, the new value will be substituted instead:

Python also allows chained assignment, which makes it possible to assign the same value to several variables simultaneously:
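
    >>> a = b = c = 300
    >>> print(a, b, c)
    300 300 300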

The chained assignment above assigns 300 to the variables a , b , and c simultaneously.

In many programming languages, variables are statically typed. That means a variable is initially declared to have a specific data type, and any value assigned to it during its lifetime must always have that type.

Variables in Python are not subject to this restriction. In Python, a variable may be assigned a value of one type and then later re-assigned a value of a different type:

What is actually happening when you make a variable assignment? This is an important question in Python, because the answer differs somewhat from what you’d find in many other programming languages.

Python is a highly object-oriented language . In fact, virtually every item of data in a Python program is an object of a specific type or class. (This point will be reiterated many times over the course of these tutorials.)

Consider this code:
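
    >>> print(300)
    300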

When presented with the statement print(300) , the interpreter does the following:

  • Creates an integer object
  • Gives it the value 300
  • Displays it to the console

You can see that an integer object is created using the built-in type() function:
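
    >>> type(300)
    <class 'int'>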

A Python variable is a symbolic name that is a reference or pointer to an object. Once an object is assigned to a variable, you can refer to the object by that name. But the data itself is still contained within the object.

For example:
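
    >>> n = 300    # creates an integer object with the value 300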

This assignment creates an integer object with the value 300 and assigns the variable n to point to that object.

Variable reference diagram

The following code verifies that n points to an integer object:
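
    >>> type(n)
    <class 'int'>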

Now consider the following statement:
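
    >>> m = n    # m now refers to the same object as n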

What happens when it is executed? Python does not create another object. It simply creates a new symbolic name or reference, m , which points to the same object that n points to.

Python variable references to the same object (illustration)

Next, suppose you do this:
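
    >>> m = 400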

Now Python creates a new integer object with the value 400 , and m becomes a reference to it.

References to separate objects in Python (diagram)

Lastly, suppose this statement is executed next:
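
    >>> n = "foo"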

Now Python creates a string object with the value "foo" and makes n reference that.

Python variable reference illustration

There is no longer any reference to the integer object 300 . It is orphaned, and there is no way to access it.

Tutorials in this series will occasionally refer to the lifetime of an object. An object’s life begins when it is created, at which time at least one reference to it is created. During an object’s lifetime, additional references to it may be created, as you saw above, and references to it may be deleted as well. An object stays alive, as it were, so long as there is at least one reference to it.

When the number of references to an object drops to zero, it is no longer accessible. At that point, its lifetime is over. Python will eventually notice that it is inaccessible and reclaim the allocated memory so it can be used for something else. In computer lingo, this process is referred to as garbage collection .

In Python, every object that is created is given a number that uniquely identifies it. It is guaranteed that no two objects will have the same identifier during any period in which their lifetimes overlap. Once an object’s reference count drops to zero and it is garbage collected, as happened to the 300 object above, then its identifying number becomes available and may be used again.

The built-in Python function id() returns an object’s integer identifier. Using the id() function, you can verify that two variables indeed point to the same object:
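
    >>> n = 300
    >>> m = n
    >>> id(m) == id(n)    # actual id() values vary per run, so only equality is shown
    True
    >>> m = 400
    >>> id(m) == id(n)
    False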

After the assignment m = n , m and n both point to the same object, confirmed by the fact that id(m) and id(n) return the same number. Once m is reassigned to 400 , m and n point to different objects with different identities.

Deep Dive: Caching Small Integer Values

From what you now know about variable assignment and object references in Python, the following probably won’t surprise you:

    >>> m = 300
    >>> n = 300
    >>> id(m)
    60062304
    >>> id(n)
    60062896

With the statement m = 300, Python creates an integer object with the value 300 and sets m as a reference to it. n is then similarly assigned to an integer object with value 300, but not the same object. Thus, they have different identities, which you can verify from the values returned by id(). But consider this:

    >>> m = 30
    >>> n = 30
    >>> id(m)
    1405569120
    >>> id(n)
    1405569120

Here, m and n are separately assigned to integer objects having value 30. But in this case, id(m) and id(n) are identical! For purposes of optimization, the interpreter creates objects for the integers in the range [-5, 256] at startup, and then reuses them during program execution. Thus, when you assign separate variables to an integer value in this range, they will actually reference the same object.

The examples you have seen so far have used short, terse variable names like m and n . But variable names can be more verbose. In fact, it is usually beneficial if they are because it makes the purpose of the variable more evident at first glance.

Officially, variable names in Python can be any length and can consist of uppercase and lowercase letters ( A-Z , a-z ), digits ( 0-9 ), and the underscore character ( _ ). An additional restriction is that, although a variable name can contain digits, the first character of a variable name cannot be a digit.

Note: One of the additions to Python 3 was full Unicode support , which allows for Unicode characters in a variable name as well. You will learn about Unicode in greater depth in a future tutorial.

For example, all of the following are valid variable names:

But this one is not, because a variable name can’t begin with a digit:

Note that case is significant. Lowercase and uppercase letters are not the same. Use of the underscore character is significant as well. Each of the following defines a different variable:

There is nothing stopping you from creating two different variables in the same program called age and Age , or for that matter agE . But it is probably ill-advised. It would certainly be likely to confuse anyone trying to read your code, and even you yourself, after you’d been away from it awhile.

It is worthwhile to give a variable a name that is descriptive enough to make clear what it is being used for. For example, suppose you are tallying the number of people who have graduated college. You could conceivably choose any of the following:

All of them are probably better choices than n , or ncg , or the like. At least you can tell from the name what the value of the variable is supposed to represent.

On the other hand, they aren’t all necessarily equally legible. As with many things, it is a matter of personal preference, but most people would find the first two examples, where the letters are all shoved together, to be harder to read, particularly the one in all capital letters. The most commonly used methods of constructing a multi-word variable name are the last three examples:

  • Example: numberOfCollegeGraduates
  • Example: NumberOfCollegeGraduates
  • Example: number_of_college_graduates

Programmers debate hotly, with surprising fervor, which of these is preferable. Decent arguments can be made for all of them. Use whichever of the three is most visually appealing to you. Pick one and use it consistently.

You will see later that variables aren’t the only things that can be given names. You can also name functions, classes, modules, and so on. The rules that apply to variable names also apply to identifiers, the more general term for names given to program objects.

The Style Guide for Python Code , also known as PEP 8 , contains Naming Conventions that list suggested standards for names of different object types. PEP 8 includes the following recommendations:

  • Snake Case should be used for functions and variable names.
  • Pascal Case should be used for class names. (PEP 8 refers to this as the “CapWords” convention.)

There is one more restriction on identifier names. The Python language reserves a small set of keywords that designate special language functionality. No object can have the same name as a reserved word.

In Python 3.6, there are 33 reserved keywords:

    False     class     finally   is        return
    None      continue  for       lambda    try
    True      def       from      nonlocal  while
    and       del       global    not       with
    as        elif      if        or        yield
    assert    else      import    pass
    break     except    in        raise

You can see this list any time by typing help("keywords") to the Python interpreter. Reserved words are case-sensitive and must be used exactly as shown. They are all entirely lowercase, except for False , None , and True .

Trying to create a variable with the same name as any reserved word results in an error:
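
    >>> for = 3
    SyntaxError: invalid syntax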

This tutorial covered the basics of Python variables , including object references and identity, and naming of Python identifiers.

You now have a good understanding of some of Python’s data types and know how to create variables that reference objects of those types.

Next, you will see how to combine data objects into expressions involving various operations .


PEP 572 – Assignment Expressions


This is a proposal for creating a way to assign to variables within an expression using the notation NAME := expr .

As part of this change, there is also an update to dictionary comprehension evaluation order to ensure key expressions are executed before value expressions (allowing the key to be bound to a name and then re-used as part of calculating the corresponding value).

During discussion of this PEP, the operator became informally known as “the walrus operator”. The construct’s formal name is “Assignment Expressions” (as per the PEP title), but they may also be referred to as “Named Expressions” (e.g. the CPython reference implementation uses that name internally).

Naming the result of an expression is an important part of programming, allowing a descriptive name to be used in place of a longer expression, and permitting reuse. Currently, this feature is available only in statement form, making it unavailable in list comprehensions and other expression contexts.

Additionally, naming sub-parts of a large expression can assist an interactive debugger, providing useful display hooks and partial results. Without a way to capture sub-expressions inline, this would require refactoring of the original code; with assignment expressions, this merely requires the insertion of a few name := markers. Removing the need to refactor reduces the likelihood that the code will be inadvertently changed as part of debugging (a common cause of Heisenbugs), and is easier to dictate to another programmer.

During the development of this PEP many people (supporters and critics both) have had a tendency to focus on toy examples on the one hand, and on overly complex examples on the other.

The danger of toy examples is twofold: they are often too abstract to make anyone go “ooh, that’s compelling”, and they are easily refuted with “I would never write it that way anyway”.

The danger of overly complex examples is that they provide a convenient strawman for critics of the proposal to shoot down (“that’s obfuscated”).

Yet there is some use for both extremely simple and extremely complex examples: they are helpful to clarify the intended semantics. Therefore, there will be some of each below.

However, in order to be compelling , examples should be rooted in real code, i.e. code that was written without any thought of this PEP, as part of a useful application, however large or small. Tim Peters has been extremely helpful by going over his own personal code repository and picking examples of code he had written that (in his view) would have been clearer if rewritten with (sparing) use of assignment expressions. His conclusion: the current proposal would have allowed a modest but clear improvement in quite a few bits of code.

Another use of real code is to observe indirectly how much value programmers place on compactness. Guido van Rossum searched through a Dropbox code base and discovered some evidence that programmers value writing fewer lines over shorter lines.

Case in point: Guido found several examples where a programmer repeated a subexpression, slowing down the program, in order to save one line of code, e.g. instead of writing:
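(The original Dropbox snippets are not reproduced here; the following is an illustrative sketch, assuming pattern is a compiled regular expression and data is a string.)

    import re

    pattern = re.compile(r"value: (\d+)")
    data = "value: 42"

    match = pattern.search(data)
    if match is not None:
        result = match.group(1)
    else:
        result = None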

they would write:
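(Same assumptions as the sketch above; note that pattern.search(data) is now evaluated twice just to save one line.)

    if pattern.search(data) is not None:
        result = pattern.search(data).group(1)  # repeats the search
    else:
        result = None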

Another example illustrates that programmers sometimes do more work to save an extra level of indentation:
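(Again an illustrative sketch rather than the original code; pattern1, pattern2 and data are assumed to be two compiled regular expressions and a string.)

    match1 = pattern1.match(data)
    match2 = pattern2.match(data)   # computed even when match1 already succeeded
    if match1:
        result = match1.group(1)
    elif match2:
        result = match2.group(2)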

This code tries to match pattern2 even if pattern1 has a match (in which case the match on pattern2 is never used). The more efficient rewrite would have been:
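(A sketch of that rewrite, under the same assumptions:)

    match1 = pattern1.match(data)
    if match1:
        result = match1.group(1)
    else:
        match2 = pattern2.match(data)  # only computed when actually needed
        if match2:
            result = match2.group(2)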

Syntax and semantics

In most contexts where arbitrary Python expressions can be used, a named expression can appear. This is of the form NAME := expr where expr is any valid Python expression other than an unparenthesized tuple, and NAME is an identifier.

The value of such a named expression is the same as the incorporated expression, with the additional side-effect that the target is assigned that value:
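For instance (a small runnable sketch, not the PEP's own listing):

    # The expression's value is also bound to the name on the left of :=
    y = (x := 7) + 3
    print(x, y)   # 7 10

    # Reuse an "expensive" intermediate value within a single expression
    values = [y := 2 ** 10, y + 1, y * 2]
    print(values)  # [1024, 1025, 2048]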

There are a few places where assignment expressions are not allowed, in order to avoid ambiguities or user confusion:

This rule is included to simplify the choice for the user between an assignment statement and an assignment expression – there is no syntactic position where both are valid.
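(A sketch of the kind of case this rules out; f and x are assumed to exist:)

    y := f(x)    # INVALID: unparenthesized at statement level
    (y := f(x))  # Valid, though not recommended as a standalone statement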

Again, this rule is included to avoid two visually similar ways of saying the same thing.

This rule is included to disallow excessively confusing code, and because parsing keyword arguments is complex enough already.
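(Roughly, with foo, f and x assumed to exist:)

    foo(x=(y := f(x)))  # Valid, though probably confusing
    foo(x = y := f(x))  # INVALID: unparenthesized in a keyword argument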

This rule is included to discourage side effects in a position whose exact semantics are already confusing to many users (cf. the common style recommendation against mutable default values), and also to echo the similar prohibition in calls (the previous bullet).

The reasoning here is similar to the two previous cases; this ungrouped assortment of symbols and operators composed of : and = is hard to read correctly.

This allows lambda to always bind less tightly than := ; having a name binding at the top level inside a lambda function is unlikely to be of value, as there is no way to make use of it. In cases where the name will be used more than once, the expression is likely to need parenthesizing anyway, so this prohibition will rarely affect code.
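(An illustrative pair of cases:)

    wrapped = lambda: (y := 2)   # Valid: parenthesized inside the lambda body
    broken = lambda: y := 2      # INVALID: unparenthesized at the top level of a lambda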

This shows that what looks like an assignment operator in an f-string is not always an assignment operator. The f-string parser uses : to indicate formatting options. To preserve backwards compatibility, assignment operator usage inside of f-strings must be parenthesized. As noted above, this usage of the assignment operator is not recommended.

An assignment expression does not introduce a new scope. In most cases the scope in which the target will be bound is self-explanatory: it is the current scope. If this scope contains a nonlocal or global declaration for the target, the assignment expression honors that. A lambda (being an explicit, if anonymous, function definition) counts as a scope for this purpose.

There is one special case: an assignment expression occurring in a list, set or dict comprehension or in a generator expression (below collectively referred to as “comprehensions”) binds the target in the containing scope, honoring a nonlocal or global declaration for the target in that scope, if one exists. For the purpose of this rule the containing scope of a nested comprehension is the scope that contains the outermost comprehension. A lambda counts as a containing scope.

The motivation for this special case is twofold. First, it allows us to conveniently capture a “witness” for an any() expression, or a counterexample for all() , for example:
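A runnable sketch of the "witness"/"counterexample" idea:

    lines = ["first line", "# a comment", "last line"]

    if any((comment := line).startswith("#") for line in lines):
        print("First comment:", comment)       # comment is bound in this scope
    else:
        print("There are no comments")

    if all((nonblank := line).strip() != "" for line in lines):
        print("No blank lines; last checked:", nonblank)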

Second, it allows a compact way of updating mutable state from a comprehension, for example:
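For example, the running-total idiom:

    values = [1, 2, 3, 4]
    total = 0
    partial_sums = [total := total + v for v in values]
    print(partial_sums)  # [1, 3, 6, 10]
    print(total)         # 10 -- updated in the enclosing scope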

However, an assignment expression target name cannot be the same as a for -target name appearing in any comprehension containing the assignment expression. The latter names are local to the comprehension in which they appear, so it would be contradictory for a contained use of the same name to refer to the scope containing the outermost comprehension instead.

For example, [i := i+1 for i in range(5)] is invalid: the for i part establishes that i is local to the comprehension, but the i := part insists that i is not local to the comprehension. The same reason makes these examples invalid too:

While it’s technically possible to assign consistent semantics to these cases, it’s difficult to determine whether those semantics actually make sense in the absence of real use cases. Accordingly, the reference implementation [1] will ensure that such cases raise SyntaxError , rather than executing with implementation defined behaviour.

This restriction applies even if the assignment expression is never executed:

For the comprehension body (the part before the first “for” keyword) and the filter expression (the part after “if” and before any nested “for”), this restriction applies solely to target names that are also used as iteration variables in the comprehension. Lambda expressions appearing in these positions introduce a new explicit function scope, and hence may use assignment expressions with no additional restrictions.

Due to design constraints in the reference implementation (the symbol table analyser cannot easily detect when names are re-used between the leftmost comprehension iterable expression and the rest of the comprehension), named expressions are disallowed entirely as part of comprehension iterable expressions (the part after each “in”, and before any subsequent “if” or “for” keyword):

A further exception applies when an assignment expression occurs in a comprehension whose containing scope is a class scope. If the rules above were to result in the target being assigned in that class’s scope, the assignment expression is expressly invalid. This case also raises SyntaxError :

(The reason for the latter exception is the implicit function scope created for comprehensions – there is currently no runtime mechanism for a function to refer to a variable in the containing class scope, and we do not want to add such a mechanism. If this issue ever gets resolved this special case may be removed from the specification of assignment expressions. Note that the problem already exists for using a variable defined in the class scope from a comprehension.)

See Appendix B for some examples of how the rules for targets in comprehensions translate to equivalent code.

The := operator groups more tightly than a comma in all syntactic positions where it is legal, but less tightly than all other operators, including or , and , not , and conditional expressions ( A if C else B ). As follows from section “Exceptional cases” above, it is never allowed at the same level as = . In case a different grouping is desired, parentheses should be used.

The := operator may be used directly in a positional function call argument; however it is invalid directly in a keyword argument.

Some examples to clarify what’s technically valid or invalid:
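(A sketch of such cases; f, lines, foo and so on are assumed to exist:)

    x := 0                            # INVALID as a statement
    (x := 0)                          # Valid
    x = y := 0                        # INVALID
    x = (y := 0)                      # Valid
    len(lines := f.readlines())       # Valid: positional argument
    foo(x := 3, cat='vector')         # Valid: positional argument
    foo(cat=category := 'vector')     # INVALID: keyword argument
    foo(cat=(category := 'vector'))   # Valid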

Most of the “valid” examples above are not recommended, since human readers of Python source code who are quickly glancing at some code may miss the distinction. But simple cases are not objectionable:

(This PEP recommends always putting spaces around := , similar to PEP 8 's recommendation for = when used for assignment, whereas the latter disallows spaces around = used for keyword arguments.)

In order to have precisely defined semantics, the proposal requires evaluation order to be well-defined. This is technically not a new requirement, as function calls may already have side effects. Python already has a rule that subexpressions are generally evaluated from left to right. However, assignment expressions make these side effects more visible, and we propose a single change to the current evaluation order:

  • In a dict comprehension {X: Y for ...} , Y is currently evaluated before X . We propose to change this so that X is evaluated before Y . (In a dict display like {X: Y} this is already the case, and also in dict((X, Y) for ...) which should clearly be equivalent to the dict comprehension.)

Most importantly, since := is an expression, it can be used in contexts where statements are illegal, including lambda functions and comprehensions.

Conversely, assignment expressions don’t support the advanced features found in assignment statements:

  • Multiple targets are not directly supported:

        x = y = z = 0  # Equivalent: (z := (y := (x := 0)))

  • Single assignment targets other than a single NAME are not supported:

        # No equivalent
        a[i] = x
        self.rest = []

  • Priority around commas is different:

        x = 1, 2     # Sets x to (1, 2)
        (x := 1, 2)  # Sets x to 1

  • Iterable packing and unpacking (both regular or extended forms) are not supported:

        # Equivalent needs extra parentheses
        loc = x, y                 # Use (loc := (x, y))
        info = name, phone, *rest  # Use (info := (name, phone, *rest))

        # No equivalent
        px, py, pz = position
        name, phone, email, *other_info = contact

  • Inline type annotations are not supported:

        # Closest equivalent is "p: Optional[int]" as a separate declaration
        p: Optional[int] = None

  • Augmented assignment is not supported:

        total += tax  # Equivalent: (total := total + tax)

The following changes have been made based on implementation experience and additional review after the PEP was first accepted and before Python 3.8 was released:

  • for consistency with other similar exceptions, and to avoid locking in an exception name that is not necessarily going to improve clarity for end users, the originally proposed TargetScopeError subclass of SyntaxError was dropped in favour of just raising SyntaxError directly. [3]
  • due to a limitation in CPython’s symbol table analysis process, the reference implementation raises SyntaxError for all uses of named expressions inside comprehension iterable expressions, rather than only raising them when the named expression target conflicts with one of the iteration variables in the comprehension. This could be revisited given sufficiently compelling examples, but the extra complexity needed to implement the more selective restriction doesn’t seem worthwhile for purely hypothetical use cases.

Examples from the Python standard library

env_base is only used on these lines, putting its assignment on the if moves it as the “header” of the block.

  • Current:

        env_base = os.environ.get("PYTHONUSERBASE", None)
        if env_base:
            return env_base

  • Improved:

        if env_base := os.environ.get("PYTHONUSERBASE", None):
            return env_base

Avoid nested if and remove one indentation level.

  • Current:

        if self._is_special:
            ans = self._check_nans(context=context)
            if ans:
                return ans

  • Improved:

        if self._is_special and (ans := self._check_nans(context=context)):
            return ans

Code looks more regular and avoids multiple nested if statements. (See Appendix A for the origin of this example.)

  • Current:

        reductor = dispatch_table.get(cls)
        if reductor:
            rv = reductor(x)
        else:
            reductor = getattr(x, "__reduce_ex__", None)
            if reductor:
                rv = reductor(4)
            else:
                reductor = getattr(x, "__reduce__", None)
                if reductor:
                    rv = reductor()
                else:
                    raise Error("un(deep)copyable object of type %s" % cls)

  • Improved:

        if reductor := dispatch_table.get(cls):
            rv = reductor(x)
        elif reductor := getattr(x, "__reduce_ex__", None):
            rv = reductor(4)
        elif reductor := getattr(x, "__reduce__", None):
            rv = reductor()
        else:
            raise Error("un(deep)copyable object of type %s" % cls)

tz is only used for s += tz , moving its assignment inside the if helps to show its scope.

  • Current:

        s = _format_time(self._hour, self._minute, self._second,
                         self._microsecond, timespec)
        tz = self._tzstr()
        if tz:
            s += tz
        return s

  • Improved:

        s = _format_time(self._hour, self._minute, self._second,
                         self._microsecond, timespec)
        if tz := self._tzstr():
            s += tz
        return s

Calling fp.readline() in the while condition and calling .match() on the if lines make the code more compact without making it harder to understand.

  • Current:

        while True:
            line = fp.readline()
            if not line:
                break
            m = define_rx.match(line)
            if m:
                n, v = m.group(1, 2)
                try:
                    v = int(v)
                except ValueError:
                    pass
                vars[n] = v
            else:
                m = undef_rx.match(line)
                if m:
                    vars[m.group(1)] = 0

  • Improved:

        while line := fp.readline():
            if m := define_rx.match(line):
                n, v = m.group(1, 2)
                try:
                    v = int(v)
                except ValueError:
                    pass
                vars[n] = v
            elif m := undef_rx.match(line):
                vars[m.group(1)] = 0

A list comprehension can map and filter efficiently by capturing the condition:
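For example, a runnable sketch with a toy f:

    f = lambda v: v * 2
    input_data = [4, -1, 9, 0]

    # Keep only items whose transformed value is positive, reusing that value
    results = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]
    print(results)  # [(4, 8, 0.5), (9, 18, 0.5)]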

Similarly, a subexpression can be reused within the main expression, by giving it a name on first use:
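A small illustration:

    f = lambda v: v + 1
    stuff = [[y := f(x), x / y] for x in range(5)]
    print(stuff)  # [[1, 0.0], [2, 0.5], [3, 0.666...], [4, 0.75], [5, 0.8]]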

Note that in both cases the variable y is bound in the containing scope (i.e. at the same level as results or stuff ).

Assignment expressions can be used to good effect in the header of an if or while statement:
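For instance (the file name here is hypothetical):

    import re

    pattern = re.compile(r"\d+")
    if (match := pattern.search("order 66")) is not None:
        print("Found:", match.group(0))

    # Read a file in fixed-size chunks until it is exhausted
    with open("example.txt", "rb") as f:
        while chunk := f.read(8192):
            print(len(chunk), "bytes read")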

Particularly with the while loop, this can remove the need to have an infinite loop, an assignment, and a condition. It also creates a smooth parallel between a loop which simply uses a function call as its condition, and one which uses that as its condition but also uses the actual value.

An example from the low-level UNIX world:
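A sketch along those lines using os.fork() (POSIX-only):

    import os

    if pid := os.fork():
        print("In the parent; the child's pid is", pid)
    else:
        print("In the child process")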

Rejected alternative proposals

Proposals broadly similar to this one have come up frequently on python-ideas. Below are a number of alternative syntaxes, some of them specific to comprehensions, which have been rejected in favour of the one given above.

A previous version of this PEP proposed subtle changes to the scope rules for comprehensions, to make them more usable in class scope and to unify the scope of the “outermost iterable” and the rest of the comprehension. However, this part of the proposal would have caused backwards incompatibilities, and has been withdrawn so the PEP can focus on assignment expressions.

Broadly the same semantics as the current proposal, but spelled differently.

Since EXPR as NAME already has meaning in import , except and with statements (with different semantics), this would create unnecessary confusion or require special-casing (e.g. to forbid assignment within the headers of these statements).

(Note that with EXPR as VAR does not simply assign the value of EXPR to VAR – it calls EXPR.__enter__() and assigns the result of that to VAR .)

Additional reasons to prefer := over this spelling include:

  • In if f(x) as y the assignment target doesn’t jump out at you – it just reads like if f x blah blah and it is too similar visually to if f(x) and y .
  • import foo as bar
  • except Exc as var
  • with ctxmgr() as var

To the contrary, the assignment expression does not belong to the if or while that starts the line, and we intentionally allow assignment expressions in other contexts as well. The parallel between

  • NAME = EXPR
  • if NAME := EXPR

reinforces the visual recognition of assignment expressions.

This syntax is inspired by languages such as R and Haskell, and some programmable calculators. (Note that a left-facing arrow y <- f(x) is not possible in Python, as it would be interpreted as less-than and unary minus.) This syntax has a slight advantage over ‘as’ in that it does not conflict with with , except and import , but otherwise is equivalent. But it is entirely unrelated to Python’s other use of -> (function return type annotations), and compared to := (which dates back to Algol-58) it has a much weaker tradition.

This has the advantage that leaked usage can be readily detected, removing some forms of syntactic ambiguity. However, this would be the only place in Python where a variable’s scope is encoded into its name, making refactoring harder.

Execution order is inverted (the indented body is performed first, followed by the “header”). This requires a new keyword, unless an existing keyword is repurposed (most likely with: ). See PEP 3150 for prior discussion on this subject (with the proposed keyword being given: ).

This syntax has fewer conflicts than as does (conflicting only with the raise Exc from Exc notation), but is otherwise comparable to it. Instead of paralleling with expr as target: (which can be useful but can also be confusing), this has no parallels, but is evocative.

One of the most popular use-cases is if and while statements. Instead of a more general solution, this proposal enhances the syntax of these two statements to add a means of capturing the compared value:

This works beautifully if and ONLY if the desired condition is based on the truthiness of the captured value. It is thus effective for specific use-cases (regex matches, socket reads that return '' when done), and completely useless in more complicated cases (e.g. where the condition is f(x) < 0 and you want to capture the value of f(x) ). It also has no benefit to list comprehensions.

Advantages: No syntactic ambiguities. Disadvantages: Answers only a fraction of possible use-cases, even in if / while statements.

Another common use-case is comprehensions (list/set/dict, and genexps). As above, proposals have been made for comprehension-specific solutions.

This brings the subexpression to a location in between the 'for' loop and the expression. It introduces an additional language keyword ( where , let or given ), which creates conflicts. Of the three, where reads the most cleanly, but also has the greatest potential for conflict (e.g. SQLAlchemy and numpy have where methods, as does tkinter.dnd.Icon in the standard library).

As above, but reusing the with keyword. Doesn’t read too badly, and needs no additional language keyword. Is restricted to comprehensions, though, and cannot as easily be transformed into “longhand” for-loop syntax. Has the C problem that an equals sign in an expression can now create a name binding, rather than performing a comparison. Would raise the question of why “with NAME = EXPR:” cannot be used as a statement on its own.

As per option 2, but using as rather than an equals sign. Aligns syntactically with other uses of as for name binding, but a simple transformation to for-loop longhand would create drastically different semantics; the meaning of with inside a comprehension would be completely different from the meaning as a stand-alone statement, while retaining identical syntax.

Regardless of the spelling chosen, this introduces a stark difference between comprehensions and the equivalent unrolled long-hand form of the loop. It is no longer possible to unwrap the loop into statement form without reworking any name bindings. The only keyword that can be repurposed to this task is with , thus giving it sneakily different semantics in a comprehension than in a statement; alternatively, a new keyword is needed, with all the costs therein.

There are two logical precedences for the := operator. Either it should bind as loosely as possible, as does statement-assignment; or it should bind more tightly than comparison operators. Placing its precedence between the comparison and arithmetic operators (to be precise: just lower than bitwise OR) allows most uses inside while and if conditions to be spelled without parentheses, as it is most likely that you wish to capture the value of something, then perform a comparison on it:

Once find() returns -1, the loop terminates. If := binds as loosely as = does, this would capture the result of the comparison (generally either True or False ), which is less useful.

While this behaviour would be convenient in many situations, it is also harder to explain than “the := operator behaves just like the assignment statement”, and as such, the precedence for := has been made as close as possible to that of = (with the exception that it binds tighter than comma).

Some critics have claimed that the assignment expressions should allow unparenthesized tuples on the right, so that these two would be equivalent:

(With the current version of the proposal, the latter would be equivalent to ((point := x), y) .)

However, adopting this stance would logically lead to the conclusion that when used in a function call, assignment expressions also bind less tight than comma, so we’d have the following confusing equivalence:

The less confusing option is to make := bind more tightly than comma.

It’s been proposed to just always require parentheses around an assignment expression. This would resolve many ambiguities, and indeed parentheses will frequently be needed to extract the desired subexpression. But in the following cases the extra parentheses feel redundant:

Frequently Raised Objections

C and its derivatives define the = operator as an expression, rather than a statement as is Python’s way. This allows assignments in more contexts, including contexts where comparisons are more common. The syntactic similarity between if (x == y) and if (x = y) belies their drastically different semantics. Thus this proposal uses := to clarify the distinction.

The two forms have different flexibilities. The := operator can be used inside a larger expression; the = statement can be augmented to += and its friends, can be chained, and can assign to attributes and subscripts.

Previous revisions of this proposal involved sublocal scope (restricted to a single statement), preventing name leakage and namespace pollution. While a definite advantage in a number of situations, this increases complexity in many others, and the costs are not justified by the benefits. In the interests of language simplicity, the name bindings created here are exactly equivalent to any other name bindings, including that usage at class or module scope will create externally-visible names. This is no different from for loops or other constructs, and can be solved the same way: del the name once it is no longer needed, or prefix it with an underscore.

(The author wishes to thank Guido van Rossum and Christoph Groth for their suggestions to move the proposal in this direction. [2] )

As expression assignments can sometimes be used equivalently to statement assignments, the question of which should be preferred will arise. For the benefit of style guides such as PEP 8 , two recommendations are suggested.

  • If either assignment statements or assignment expressions can be used, prefer statements; they are a clear declaration of intent.
  • If using assignment expressions would lead to ambiguity about execution order, restructure it to use statements instead.

The authors wish to thank Alyssa Coghlan and Steven D’Aprano for their considerable contributions to this proposal, and members of the core-mentorship mailing list for assistance with implementation.

Appendix A: Tim Peters’s findings

Here’s a brief essay Tim Peters wrote on the topic.

I dislike “busy” lines of code, and also dislike putting conceptually unrelated logic on a single line. So, for example, instead of:

instead. So I suspected I’d find few places I’d want to use assignment expressions. I didn’t even consider them for lines already stretching halfway across the screen. In other cases, “unrelated” ruled:

is a vast improvement over the briefer:

The original two statements are doing entirely different conceptual things, and slamming them together is conceptually insane.

In other cases, combining related logic made it harder to understand, such as rewriting:

as the briefer:

The while test there is too subtle, crucially relying on strict left-to-right evaluation in a non-short-circuiting or method-chaining context. My brain isn’t wired that way.

But cases like that were rare. Name binding is very frequent, and “sparse is better than dense” does not mean “almost empty is better than sparse”. For example, I have many functions that return None or 0 to communicate “I have nothing useful to return in this case, but since that’s expected often I’m not going to annoy you with an exception”. This is essentially the same as regular expression search functions returning None when there is no match. So there was lots of code of the form:
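(A sketch of the shape of such code; solution() here is a stand-in for any function that returns a useful value or None/0:)

    def solution(xs, n):
        # Toy stand-in: first element greater than n, or None
        return next((x for x in xs if x > n), None)

    xs, n = [1, 5, 9], 4

    result = solution(xs, n)
    if result:
        print("found", result)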

I find that clearer, and certainly a bit less typing and pattern-matching reading, as:
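(Using the same stand-in as above:)

    if result := solution(xs, n):
        print("found", result)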

It’s also nice to trade away a small amount of horizontal whitespace to get another _line_ of surrounding code on screen. I didn’t give much weight to this at first, but it was so very frequent it added up, and I soon enough became annoyed that I couldn’t actually run the briefer code. That surprised me!

There are other cases where assignment expressions really shine. Rather than pick another from my code, Kirill Balunov gave a lovely example from the standard library’s copy() function in copy.py :

The ever-increasing indentation is semantically misleading: the logic is conceptually flat, “the first test that succeeds wins”:

Using easy assignment expressions allows the visual structure of the code to emphasize the conceptual flatness of the logic; ever-increasing indentation obscured it.

A smaller example from my code delighted me, both allowing to put inherently related logic in a single line, and allowing to remove an annoying “artificial” indentation level:

That if is about as long as I want my lines to get, but remains easy to follow.

So, in all, in most lines binding a name, I wouldn’t use assignment expressions, but because that construct is so very frequent, that leaves many places I would. In most of the latter, I found a small win that adds up due to how often it occurs, and in the rest I found a moderate to major win. I’d certainly use it more often than ternary if , but significantly less often than augmented assignment.

I have another example that quite impressed me at the time.

Where all variables are positive integers, and a is at least as large as the n’th root of x, this algorithm returns the floor of the n’th root of x (roughly doubling the number of accurate bits per iteration):
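(A sketch of the algorithm described, wrapped in a function so it can be run:)

    def floor_nth_root(x, n, a):
        # Requires: x, n, a positive integers and a >= floor(x ** (1/n))
        while a > (d := x // a ** (n - 1)):
            a = ((n - 1) * a + d) // n
        return a

    print(floor_nth_root(125, 3, 125))  # 5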

It’s not obvious why that works, but is no more obvious in the “loop and a half” form. It’s hard to prove correctness without building on the right insight (the “arithmetic mean - geometric mean inequality”), and knowing some non-trivial things about how nested floor functions behave. That is, the challenges are in the math, not really in the coding.

If you do know all that, then the assignment-expression form is easily read as “while the current guess is too large, get a smaller guess”, where the “too large?” test and the new guess share an expensive sub-expression.

To my eyes, the original form is harder to understand:
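(The equivalent "loop and a half" form, under the same assumptions:)

    def floor_nth_root(x, n, a):
        while True:
            d = x // a ** (n - 1)
            if a <= d:
                break
            a = ((n - 1) * a + d) // n
        return a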

This appendix attempts to clarify (though not specify) the rules when a target occurs in a comprehension or in a generator expression. For a number of illustrative examples we show the original code, containing a comprehension, and the translation, where the comprehension has been replaced by an equivalent generator function plus some scaffolding.

Since [x for ...] is equivalent to list(x for ...) these examples all use list comprehensions without loss of generality. And since these examples are meant to clarify edge cases of the rules, they aren’t trying to look like real code.

Note: comprehensions are already implemented via synthesizing nested generator functions like those in this appendix. The new part is adding appropriate declarations to establish the intended scope of assignment expression targets (the same scope they resolve to as if the assignment were performed in the block containing the outermost comprehension). For type inference purposes, these illustrative expansions do not imply that assignment expression targets are always Optional (but they do indicate the target binding scope).

Let’s start with a reminder of what code is generated for a generator expression without assignment expression.

  • Original code (EXPR usually references VAR):

        def f():
            a = [EXPR for VAR in ITERABLE]

  • Translation (let's not worry about name conflicts):

        def f():
            def genexpr(iterator):
                for VAR in iterator:
                    yield EXPR
            a = list(genexpr(iter(ITERABLE)))

Let’s add a simple assignment expression.

  • Original code:

        def f():
            a = [TARGET := EXPR for VAR in ITERABLE]

  • Translation:

        def f():
            if False:
                TARGET = None  # Dead code to ensure TARGET is a local variable
            def genexpr(iterator):
                nonlocal TARGET
                for VAR in iterator:
                    TARGET = EXPR
                    yield TARGET
            a = list(genexpr(iter(ITERABLE)))

Let’s add a global TARGET declaration in f() .

  • Original code:

        def f():
            global TARGET
            a = [TARGET := EXPR for VAR in ITERABLE]

  • Translation:

        def f():
            global TARGET
            def genexpr(iterator):
                global TARGET
                for VAR in iterator:
                    TARGET = EXPR
                    yield TARGET
            a = list(genexpr(iter(ITERABLE)))

Or instead let’s add a nonlocal TARGET declaration in f() .

  • Original code:

        def g():
            TARGET = ...
            def f():
                nonlocal TARGET
                a = [TARGET := EXPR for VAR in ITERABLE]

  • Translation:

        def g():
            TARGET = ...
            def f():
                nonlocal TARGET
                def genexpr(iterator):
                    nonlocal TARGET
                    for VAR in iterator:
                        TARGET = EXPR
                        yield TARGET
                a = list(genexpr(iter(ITERABLE)))

Finally, let’s nest two comprehensions.

  • Original code:

        def f():
            a = [[TARGET := i for i in range(3)] for j in range(2)]
            # I.e., a = [[0, 1, 2], [0, 1, 2]]
            print(TARGET)  # prints 2

  • Translation:

        def f():
            if False:
                TARGET = None
            def outer_genexpr(outer_iterator):
                nonlocal TARGET
                def inner_generator(inner_iterator):
                    nonlocal TARGET
                    for i in inner_iterator:
                        TARGET = i
                        yield i
                for j in outer_iterator:
                    yield list(inner_generator(range(3)))
            a = list(outer_genexpr(range(2)))
            print(TARGET)

Because it has been a point of confusion, note that nothing about Python’s scoping semantics is changed. Function-local scopes continue to be resolved at compile time, and to have indefinite temporal extent at run time (“full closures”). Example:
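(The PEP's own example is not reproduced here; the following generic sketch illustrates the unchanged behaviour: names resolve to function-local scopes at compile time, and closed-over variables outlive the call that created them.)

    def make_counter():
        count = 0                # local to make_counter
        def bump():
            nonlocal count       # resolved at compile time
            count += 1
            return count
        return bump              # count lives on after make_counter returns

    counter = make_counter()
    print(counter(), counter())  # 1 2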

This document has been placed in the public domain.

Source: https://github.com/python/peps/blob/main/peps/pep-0572.rst

Last modified: 2023-10-11 12:05:51 GMT

Multiple assignment in Python: Assign multiple values or the same value to multiple variables

In Python, the = operator is used to assign values to variables.

You can assign values to multiple variables in one line.

Assign multiple values to multiple variables

Assign the same value to multiple variables.

You can assign multiple values to multiple variables by separating them with commas , .
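For example:

    a, b, c = 100, 200, 300
    print(a, b, c)  # 100 200 300

    # Different types are fine too
    name, age, is_admin = "alice", 30, False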

You can assign values to more than three variables, and it is also possible to assign values of different data types to those variables.

When only one variable is on the left side, values on the right side are assigned as a tuple to that variable.
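A quick illustration:

    t = 100, 200, 300
    print(t)        # (100, 200, 300)
    print(type(t))  # <class 'tuple'>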

If the number of variables on the left does not match the number of values on the right, a ValueError occurs. You can assign the remaining values as a list by prefixing the variable name with * .
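For example:

    # a, b = 100, 200, 300  # ValueError: too many values to unpack (expected 2)

    a, *rest = 100, 200, 300
    print(a)     # 100
    print(rest)  # [200, 300]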

For more information on using * and assigning elements of a tuple and list to multiple variables, see the following article.

  • Unpack a tuple and list in Python

You can also swap the values of multiple variables in the same way. See the following article for details:

  • Swap values in a list or values of variables in Python

You can assign the same value to multiple variables by using = consecutively.

For example, this is useful when initializing multiple variables with the same value.
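    a = b = 100
    print(a, b)  # 100 100

    b = 200      # reassigning b leaves a unchanged (for immutable values)
    print(a, b)  # 100 200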

After assigning the same value, you can assign a different value to one of these variables. As described later, be cautious when assigning mutable objects such as list and dict .

You can apply the same method when assigning the same value to three or more variables.

Be careful when assigning mutable objects such as list and dict .

If you use = consecutively, the same object is assigned to all variables. Therefore, if you change the value of an element or add a new element in one variable, the changes will be reflected in the others as well.
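For example:

    c = d = [0, 1]
    c[0] = 100     # c and d refer to the same list object
    print(d)       # [100, 1]
    print(c is d)  # True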

If you want to handle mutable objects separately, you need to assign them individually.
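For example:

    c = [0, 1]
    d = [0, 1]
    c[0] = 100
    print(d)       # [0, 1] -- d is a separate list
    print(c is d)  # False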

"After c = []; d = [] , c and d are guaranteed to refer to two different, unique, newly created empty lists. (Note that c = d = [] assigns the same object to both c and d .)" — 3. Data model, Python 3.11.3 documentation

You can also use copy() or deepcopy() from the copy module to make shallow and deep copies. See the following article.

  • Shallow and deep copy in Python: copy(), deepcopy()


Python Conditional Assignment

When you want to assign a value to a variable based on some condition, like if the condition is true then assign a value to the variable, else assign some other value to the variable, then you can use the conditional assignment operator.

In this tutorial, we will look at different ways to assign values to a variable based on some condition.

1. Using Ternary Operator

The ternary operator is a very special operator in Python; it is used to assign a value to a variable based on some condition.

It goes like this:
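(In general form, with the names as placeholders:)

    variable = value_if_true if condition else value_if_false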

Here, the value of variable will be value_if_true if the condition is true, else it will be value_if_false .

Let's see a code snippet to understand it better.
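    a, b = 10, 20
    c = a if a > b else b
    print(c)  # 20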

You can see we have conditionally assigned a value to variable c based on the condition a > b .

2. Using if-else statement

if-else statements are a core part of any programming language; they are used to execute a block of code based on some condition.

Using an if-else statement, we can assign a value to a variable based on the condition we provide.

Here is an example of replacing the above code snippet with the if-else statement.
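    a, b = 10, 20
    if a > b:
        c = a
    else:
        c = b
    print(c)  # 20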

3. Using Logical Short Circuit Evaluation

Logical short-circuit evaluation is another way to assign a value to a variable conditionally.

The format of logical short circuit evaluation is:
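(Again with placeholder names:)

    variable = condition and value_if_true or value_if_false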

It looks similar to the ternary operator, but it is not the same. Here, condition and value_if_true are combined with a logical and : if both are truthy, variable is set to value_if_true ; otherwise it gets value_if_false .

Let's see an example:
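    a, b = 10, 20
    c = a < b and "a is smaller" or "b is smaller or equal"
    print(c)  # a is smaller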

But if we make condition True but value_if_true False (or 0 or None), then the value of variable will be value_if_false .
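For example, with 0 as value_if_true:

    a, b = 10, 20
    c = a < b and 0 or 20
    print(c)  # 20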

So, you can see that the value of c is 20 even though the condition a < b is True .

So, you should be careful while using logical short circuit evaluation.

While working with lists , we often need to check if a list is empty or not, and if it is empty then we need to assign some default value to it.

Let's see how we can do it using conditional assignment.
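One way to do it, using the or idiom:

    my_list = []
    my_list = my_list or ["default value"]
    print(my_list)  # ['default value']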

Here, we have assigned a default value to my_list if it is empty.

Assign a value to a variable conditionally based on the presence of an element in a list.
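For example:

    fruits = ["apple", "banana", "mango"]
    price = 100 if "apple" in fruits else 0
    print(price)  # 100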

Now you know 3 different ways to assign a value to a variable conditionally. Any of these methods can be used to assign a value when there is a condition.

The cleanest and fastest way to assign a value conditionally is the ternary operator .

An if-else statement is recommended when you need to execute a whole block of code based on some condition, not just assign a value.

Happy coding! 😊

Python Dictionary

A Python dictionary is a collection of items, similar to lists and tuples. However, unlike lists and tuples, each item in a dictionary is a key-value pair (consisting of a key and a value).

  • Create a Dictionary

We create a dictionary by placing key: value pairs inside curly brackets {} , separated by commas. For example,
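(A small example; the names are illustrative:)

    # A dictionary mapping countries to their capital cities
    capital_city = {"Nepal": "Kathmandu", "Italy": "Rome", "England": "London"}
    print(capital_city)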

Key Value Pairs in a Dictionary

  • Dictionary keys must be immutable, such as tuples, strings, integers, etc. We cannot use mutable (changeable) objects such as lists as keys.
  • We can also create a dictionary using a Python built-in function dict() . To learn more, visit Python dict() .

Valid and Invalid Dictionaries

Immutable objects can't be changed once created. Some immutable objects in Python are integer, tuple and string.
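For example:

    # Integers, strings and tuples are all valid keys
    valid_dict = {1: "one", "name": "Python", (1, 2): "a tuple key"}

    # A list is mutable, so using it as a key raises an error:
    # invalid_dict = {[1, 2]: "oops"}  # TypeError: unhashable type: 'list'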

In this example, we have used integers, tuples, and strings as keys for the dictionaries. When we used a list as a key, an error message occurred due to the list's mutable nature.

Note: Dictionary values can be of any data type, including mutable types like lists.

The keys of a dictionary must be unique. If there are duplicate keys, the later value of the key overwrites the previous value.
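For example:

    hogwarts = {
        "Harry Potter": "Gryffindor",
        "Harry Potter": "Slytherin",  # duplicate key: this value wins
    }
    print(hogwarts["Harry Potter"])  # Slytherin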

Here, the key Harry Potter is first assigned to Gryffindor . However, there is a second entry where Harry Potter is assigned to Slytherin .

As duplicate keys are not allowed in a dictionary, the last entry Slytherin overwrites the previous value Gryffindor .

  • Access Dictionary Items

We can access the value of a dictionary item by placing the key inside square brackets.
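For example:

    capital_city = {"Nepal": "Kathmandu", "Italy": "Rome"}
    print(capital_city["Nepal"])  # Kathmandu
    # Accessing a missing key raises KeyError:
    # print(capital_city["Japan"])  # KeyError: 'Japan'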

Note: We can also use the get() method to access dictionary items.

  • Add Items to a Dictionary

We can add an item to a dictionary by assigning a value to a new key. For example,
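    capital_city = {"Nepal": "Kathmandu"}
    capital_city["Italy"] = "Rome"  # new key, new item
    print(capital_city)  # {'Nepal': 'Kathmandu', 'Italy': 'Rome'}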

  • Remove Dictionary Items

We can use the del statement to remove an element from a dictionary. For example,
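    capital_city = {"Nepal": "Kathmandu", "Italy": "Rome"}
    del capital_city["Italy"]
    print(capital_city)  # {'Nepal': 'Kathmandu'}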

Note : We can also use the pop() method to remove an item from a dictionary.

If we need to remove all items from a dictionary at once, we can use the clear() method.

  • Change Dictionary Items

Python dictionaries are mutable (changeable). We can change the value of a dictionary element by referring to its key. For example,
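    capital_city = {"Italy": "Naples"}
    capital_city["Italy"] = "Rome"  # existing key, value replaced
    print(capital_city)  # {'Italy': 'Rome'}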

Note : We can also use the update() method to add or change dictionary items.

  • Iterate Through a Dictionary

A dictionary is an ordered collection of items (starting from Python 3.7), therefore it maintains the order of its items.

We can iterate through dictionary keys one by one using a for loop .
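For example:

    squares = {1: 1, 2: 4, 3: 9}
    for key in squares:
        print(key, squares[key])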

  • Find Dictionary Length

We can find the length of a dictionary by using the len() function.
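    squares = {1: 1, 2: 4, 3: 9}
    print(len(squares))  # 3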

  • Python Dictionary Methods

Here are some of the commonly used dictionary methods .

Function    Description
pop()       Removes the item with the specified key.
update()    Adds or changes dictionary items.
clear()     Removes all the items from the dictionary.
keys()      Returns all the dictionary's keys.
values()    Returns all the dictionary's values.
get()       Returns the value of the specified key.
popitem()   Returns the last inserted key and value as a tuple.
copy()      Returns a copy of the dictionary.
  • Dictionary Membership Test

We can check whether a key exists in a dictionary by using the in and not in operators.
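For example:

    squares = {1: 1, 2: 4, 3: 9}
    print(1 in squares)      # True
    print(4 in squares)      # False -- 4 is a value, not a key
    print(5 not in squares)  # True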

Note: The in operator checks whether a key exists; it doesn't check whether a value exists or not.


Before we wrap up, let’s put your knowledge of Python dictionary to the test! Can you solve the following challenge?

Write a function to merge two dictionaries.

  • Merge dict1 and dict2 , then return the merged dictionary.
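One possible solution (a sketch; from Python 3.9 onward dict1 | dict2 also works):

    def merge_dicts(dict1, dict2):
        merged = dict1.copy()
        merged.update(dict2)  # values from dict2 win for duplicate keys
        return merged

    print(merge_dicts({"a": 1}, {"b": 2}))  # {'a': 1, 'b': 2}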

Exceptions

exception dataclasses.FrozenInstanceError — Raised when an implicitly defined __setattr__() or __delattr__() is called on a dataclass which was defined with frozen=True . It is a subclass of AttributeError .

Proteomic changes in Alzheimer disease associated with progressive Aβ plaque and tau tangle pathologies

  • Alexa Pichet Binette   ORCID: orcid.org/0000-0001-5218-3337 1 ,
  • Chris Gaiteri 2 , 3 ,
  • Malin Wennström 4 ,
  • Atul Kumar 1 ,
  • Ines Hristovska 1 ,
  • Nicola Spotorno 1 ,
  • Gemma Salvadó   ORCID: orcid.org/0000-0002-5210-9230 1 ,
  • Olof Strandberg 1 ,
  • Hansruedi Mathys   ORCID: orcid.org/0000-0003-0186-2115 5 , 6 , 7 ,
  • Li-Huei Tsai   ORCID: orcid.org/0000-0003-1262-0592 5 , 6 ,
  • Sebastian Palmqvist   ORCID: orcid.org/0000-0002-9267-1930 1 , 8 ,
  • Niklas Mattsson-Carlgren 1 , 9 , 10 ,
  • Shorena Janelidze 1 ,
  • Erik Stomrud 1 , 3 , 8 ,
  • Jacob W. Vogel 11 &
  • Oskar Hansson   ORCID: orcid.org/0000-0001-8467-7286 1 , 8  

Nature Neuroscience (2024)

Proteomics can shed light on the dynamic and multifaceted alterations in neurodegenerative disorders like Alzheimer disease (AD). Combining radioligands measuring β-amyloid (Aβ) plaques and tau tangles with cerebrospinal fluid proteomics, we uncover molecular events mirroring different stages of AD pathology in living humans. We found 127 differentially abundant proteins (DAPs) across the AD spectrum. The strongest Aβ-related proteins were mainly expressed in glial cells and included SMOC1 and ITGAM. A dozen proteins linked to ATP metabolism and preferentially expressed in neurons were independently associated with tau tangle load and tau accumulation. Only 20% of the DAPs were also altered in other neurodegenerative diseases, underscoring AD’s distinct proteome. Two co-expression modules related, respectively, to protein metabolism and microglial immune response encompassed most DAPs, with opposing, staggered trajectories along the AD continuum. We unveil protein signatures associated with Aβ and tau proteinopathy in vivo, offering insights into complex neural responses and potential biomarkers and therapeutics targeting different disease stages.


AD pathology is characterized by the formation of insoluble protein aggregates, including plaques containing Aβ fibrils and neurofibrillary tangles containing tau fibrils 1 , 2 . Aβ plaques are the first pathology to accumulate in the brain, facilitating the accumulation and spread of tau tangles throughout the neocortex decade(s) later, at which point clear neurodegeneration and cognitive impairment ensue 3 . Although the canonical AD pathology is well known, it is increasingly clear that the disease etiology is multifactorial and biological features beyond Aβ and tau are critical in AD pathogenesis 4 , 5 . To gain more precise insight into AD pathophysiology, it is thus important to deepen our knowledge about which proteins and biological pathways are independently or conjointly associated with insoluble Aβ plaques and tau tangles. Such insight can lead to new biomarkers for disease staging or monitoring and, perhaps more critically, to identification of new disease mechanisms that might be targeted in future therapeutic interventions. Furthermore, although AD is the most common cause of dementia, many non-AD neurodegenerative diseases also lead to dementia and systematic proteomic comparisons between different conditions is important to dissect converging or diverging biological pathways leading to neurodegeneration across diseases 6 .

Proteomic studies on postmortem tissue have shown the benefit of increasing sample size into the hundreds to account for heterogeneity 7 , 8 . More recently, large in vivo cerebrospinal fluid (CSF) proteomic studies have also emerged. Such studies identified many DAPs between diagnostic groups in sporadic and autosomal dominant AD 6 , 9 , 10 , suggesting added value of proteomics to capture cognitive decline 11 , and highlighted common hits across proteomics in brain tissue and biofluids 12 . In most previous studies, AD pathology was measured with CSF Aβ and p-tau levels. However, such fluid biomarkers, especially p-tau, do not directly reflect pathology in the brain and the latter is more accurately captured with positron emission tomography (PET). In particular, tau-PET uniquely captures fibrillar tau tangle pathology, in contrast to soluble p-tau levels measured in CSF or plasma that reflect Aβ-plaque load rather than tau load 13 , 14 . Tau-PET is also more closely related to clinical progression and cognitive decline than fluid markers 15 , 16 , making it a key modality to study in relation to cellular and molecular changes. For these reasons, we combined CSF proteomics in a large sample covering the AD spectrum with PET radioligands measuring the loads of Aβ plaques and tau tangle pathologies, in cross-sectional and longitudinal analyses. Our main goal was to compare individuals based on their underlying AD pathology rather than their cognitive status, therefore capturing proteomic profiles specific to different pathophysiological stages of the disease, and to further characterize such proteins using imaging, transcriptomics and systems biology tools.

We studied 877 deeply phenotyped participants from the BioFINDER-2 cohort, in which thousands of CSF protein levels were analyzed (Olink Explore 3072 proximity extension assay). Individuals were grouped according to their positivity on Aβ and tau pathologies, assessed using the CSF Aβ42/Aβ40 ratio and tau-PET uptake in a temporal meta-region of interest (ROI), respectively ( Methods ). CSF Aβ42/Aβ40 ratio accurately detects the presence of Aβ plaques in the brain (A) 17 and tau-PET detects insoluble tau fibrils in the cortex (T) 18 , 19 . CSF Aβ42/Aβ40 was chosen over Aβ-PET to create the A/T categories to maximize the sample size, because patients with dementia do not undergo Aβ-PET in the BioFINDER cohorts. Protein associations with continuous Aβ-PET standardized uptake value ratio (SUVR) are investigated in subsequent analyses. The A/T classification resulted in four groups of individuals: 352 A − T − (Aβ and tau negative without a neurodegenerative disease diagnosis), 184 A + T − (Aβ positive, tau negative), 231 A + T + (Aβ and tau positive) and 110 non-AD (Aβ-negative patients with clinically diagnosed non-AD neurodegenerative diseases) (Table 1 and Extended Data Table 1 ). As expected, the A + T − group and especially the A + T + group contained more cognitively impaired individuals than the A − T − group (Table 1 ). The first set of analyses focused on the AD spectrum (A − T − , A + T − and A + T + ) and the group of non-AD patients was investigated subsequently. Figure 1 displays an overview of the study design and main analyses.

Figure 1: From all BioFINDER-2 participants with Olink proteomic data from CSF, we first assessed DAPs across the different A/T categories. From these DAPs, we then evaluated whether: (1) they were independently related to Aβ plaques or tau tangle pathology load (baseline PET) and rate of change (longitudinal PET); (2) the proteins’ regional gene expression in the brain matched the regional PET pattern; and (3) they were enriched in different cell types or biological processes using enrichment analyses. Last, we derived protein co-expression modules to investigate the overlap between such modules and the DAPs.

Proteomic signatures of Aβ and tau pathology

We first assessed DAPs between individuals without elevated AD pathology compared with those positive for Aβ pathology only (A − T − (no AD pathology) versus A + T − (isolated Aβ pathology)). Subsequently, we compared the A + T − group with individuals positive on both key AD pathologies (A + T − versus A + T + (both Aβ pathology and cortical tau tangle pathology)). These comparisons allowed investigation of protein differences along the expected pathological cascade of AD. All analyses included age, sex and mean overall protein level as covariates. Fifty-one proteins were differentially abundant ( P value adjusted for false discovery rate (FDR) ( P FDR)  < 0.01) between A + T − and A − T − . Almost all of those (94%, 48 of 51) were upregulated in individuals with isolated Aβ pathology (Fig. 2a ) and some of the main proteins included SMOC1, MDH1, SNAP29, SOD1 and SOD2.

Figure 2: a , b , Volcano plots depicting DAPs in different groups: A − T − ( n  = 352) versus A + T − ( n  = 184) ( a ) and A+T − versus A + T + ( n  = 231) ( b ). Models included age, sex and mean overall protein level as covariates. The red line represents the threshold of P FDR  < 0.01, above which we considered proteins for subsequent analyses. Only the top proteins are labeled for legibility on the volcano plots. c , Venn diagram summarizing the comparisons shown in a and b based on proteins significant at P FDR  < 0.01 in the main cohort, BioFINDER-2. d , Volcano plots depicting DAPs in BioFINDER-1 (validation cohort, Olink proteomics) between A − ( n  = 415) and A + ( n  = 292) individuals, including age, sex and mean overall protein level as covariates. e , Volcano plots depicting DAPs in ADNI (validation cohort, SomaLogic proteomics) between A − ( n  = 212) and A + ( n  = 215), including age and sex as covariates. Some proteins appear twice given that different aptamers measured the same protein. A − or A + status was based on Aβ-PET. In d and e , the analysis was restricted to proteins that overlapped with those available in BioFINDER-2 (BF2). All standardized β values displayed come from two-sided linear regressions and all P values were adjusted for FDR.

Next, we studied which proteins changed with progression to cortical tau fibrillar pathology, that is, which proteins further differed when comparing A + T − with A + T + individuals. This comparison revealed more DAPs, with 45 upregulated proteins (48%) in individuals with both elevated Aβ plaque and tau tangle pathologies and 48 downregulated proteins (52%) (Fig. 2b ). Overall, 15 proteins (CNP, CRKL, DTX3, GLOD4, ITGAM, ITGB2, MAP2K1, MAPT, MIF, NRGN, NSFL1C, PPP3R1, SDC4, TMSB10 and TXNRD) were commonly upregulated across the two comparisons, being elevated already in the A + T − participants and showing further elevation in the A + T + group (Fig. 2c ). We hereafter refer to them as the ‘core proteins’, given that they were differentially abundant throughout the AD spectrum and increased with advancing AD pathology. DAPs specific to the A + T − versus A − T − contrast are subsequently referred to as ‘early DAPs’. Of those, SMOC1 showed the strongest effect and was elevated in A + T − versus A − T − , but not further as fibrillar tau pathology develops (that is, when comparing A + T + with A + T − ). The proteins uniquely differentially abundant in A + T + (against A + T − ) were split into two categories of ‘late DAPs’: upregulated or downregulated late DAPs. Overall, 127 unique proteins were differentially abundant along the AD spectrum (Fig. 2c ) and were retained for subsequent analyses. All summary statistics from the differential expression analyses are reported in Supplementary Table 1 .

We validated a subset of the DAPs in an independent cohort of 631 participants from the BioFINDER-1 study (Extended Data Table 2 ), in which a smaller set of CSF proteins was quantified with Olink. From all proteins analyzed in BioFINDER-2, 202 overlapped in BioFINDER-1. In this validation cohort, we compared participants based on their Aβ status (213 A + versus 418 A − ) because tau-PET was unavailable (Fig. 2d and see Supplementary Table 2 for all statistical results). Overall, there was 92% consistency in differential abundance analyses between the two cohorts, with the strongest DAPs being found in both datasets. Similarly, we also replicated many of the main DAPs using 463 ADNI participants who had CSF proteomics quantified with SomaLogic (251 A + and 212 A − ). We analyzed SomaLogic proteins that matched the ones analyzed in BioFINDER-2, yielding an overall 90% consistency in significant DAPs between cohorts (Fig. 1e and Supplementary Table 3 ). SMOC1 stood out as the main protein more abundant in A + compared with A − in the AD Neuroimaging Initiative (ADNI), consistent with previous results.

In complementary analyses, we investigated whether the regional gene expression pattern of DAPs correlated with the pattern of Aβ or tau aggregates in the brain using imaging transcriptomics (Extended Data Fig. 1, Supplementary Table 4 and Supplementary Results ). Overall, we found only moderate associations between regional RNA expression of seven DAPs and the tau-PET pattern and one with the Aβ-PET pattern.

Distinct biological processes between early and late proteomes

We next investigated whether the DAPs were associated with the load of either insoluble Aβ-containing plaques or tau fibrillar aggregates (cross-sectional analyses, Fig. 3a ) as well as the accumulation over time of the two proteinopathies (longitudinal analyses, see below). Across all cognitively unimpaired (CU) individuals and patients with mild cognitive impairment (MCI), SMOC1 and ITGAM showed the strongest positive associations with global Aβ-PET levels (Fig. 3a ), all associations being independent of tau-PET levels. When restricting the sample to Aβ-positive individuals, SMOC1 was the only protein that remained associated with Aβ-PET uptake, independently of tau. Several core and late upregulated proteins were independently related to tau-PET load, with associations being even clearer in the Aβ-positive group, as expected because tau-PET uptake is elevated in Aβ-positive individuals. In particular, FABP3, ENO2, ENO1 (enolase 1 and 2), MAPT, NRGN (neurogranin), MIF, TMSB10 and GLOD4 showed the strongest associations with tau-PET load: higher levels of these proteins in the CSF were related to greater baseline tau fibrillar pathology independent of Aβ-PET load (all standardized coefficients between 0.13 and 0.21, P FDR from 0.01 to <0.001; Fig. 3a ). See Supplementary Table 5 for all statistical results across the 127 DAPs and Extended Data Fig. 2 for results when Aβ- and tau-PET are investigated individually (that is, separate regression models for each pathology).

figure 3

a , Standardized (Std) β coefficients from linear models relating AD fibrillar pathology (Aβ- and tau-PET SUVR both included as independent variables) to the CSF protein levels. Models included age, sex and mean overall protein level as covariates. Only proteins with significant associations are reported in the figure. b , Proportion of expression by cell type from single-cell transcriptomics data from the middle temporal gyrus for all proteins shown in a . To improve legibility, only average expressions >5% are displayed. c , Summary terms from functional enrichment analyses using GO databases from the different categories of proteins. For enrichment analyses the 1,331 Olink proteins were used as background. All linear regressions performed were two sided and P values were adjusted for FDR. * P FDR  < 0.05, ** P FDR  < 0.01, *** P FDR  < 0.001. adj., adjusted.

To gain greater insight into cell-type specificity of the DAPs, we calculated the percentage of messenger RNA expression of those proteins across different cell types, based on single-cell transcriptomics data (Allen Human Brain Atlas (AHBA); Fig. 3b and Extended Data Fig. 3 ). Most proteins were mainly expressed in neurons, split between glutamatergic and γ-aminobutyric acid (GABA)-ergic neurons. However, the few proteins strongly associated with levels of Aβ were expressed in different cell types: SMOC1 was the only protein specific to oligodendrocyte precursor cells (OPCs) and ITGAM and ITGB2 were among the few proteins specific to microglia. The same cell-type specificity was found when using a large single-cell transcriptomics dataset from 427 ROSMAP (Religious Orders Study/Memory and Aging Project) participants split into A + and A − instead, based on neuropathology (Extended Data Fig. 4 ) 20 . Comparing the neuronal versus glial expression of proteins associated with Aβ-PET versus the ones specifically associated with tau-PET (based on Fig. 3a ), Aβ-related proteins were significantly more expressed in glial cells than the tau-related proteins, which were more neuronal ( χ 2  = 5.6, P  = 0.03; 7 glial/10 neuronal proteins for Aβ; 0 glial/10 neuronal for tau). Quantitative cell-type enrichment analyses in the core, early, upregulated and downregulated groups of DAPs showed no significant enrichment of any cell type (Extended Data Fig. 5a ).

Functional enrichment analysis allowed further understanding of the biological roles of the DAPs (summary results in Fig. 3c ; see Supplementary Table 6 for all significant gene ontology (GO) terms). Early proteins were enriched for terms related to synaptic transmission, cellular ubiquitination and detoxification and cellular response to free radicals, corresponding to different cellular components (mitochondria, cytoplasm and cell projection) (Fig. 3d ). Upregulated late proteins were enriched for terms related to ATP process and glycolysis in the cytosol. Downregulated late proteins were enriched in cellular components related to nucleus and structures not bounded by lipid membrane (for example, ribosomes, cytoskeleton and chromosomes). In contrast, core proteins were enriched for terms related to the cellular membrane. In the more restricted set of proteins specifically associated with either Aβ- or tau-PET uptake (for example, from Fig. 3a ), the tau-related DAPs were enriched for two biological processes similar to the late upregulated DAPs: NAD biosynthetic process and nucleotide phosphorylation (both P FDR  < 0.001), with the contributing proteins being ENO1, ENO2 and GPI (glucose-6-phosphate isomerase). The Aβ-related proteins had no significant GO enrichment.

Given the predominance of SMOC1 in the proteomic literature and that it showed the strongest associations with Aβ, we further characterized this protein using postmortem human data. We confirmed higher SMOC1 expression predominantly in OPCs with greater postmortem AD pathology and high SMOC1 levels in brain tissue from patients with AD (Extended Data Fig. 6 ).

Select proteins presage subsequent tau-PET accumulation

We next assessed proteins associated with the accumulation of AD fibrillar brain pathology over time, that is, Aβ- and tau-PET rate of change. There were no associations between rate of change of Aβ load and any CSF protein level at baseline. On the other hand, higher levels of many core and late upregulated DAPs were related to faster accumulation of tau fibrillar pathology independent of Aβ plaques accumulation (Fig. 4 ). The proteins mentioned above related to baseline tau-PET load were also the ones most strongly associated with longitudinal tau-PET (FABP3, ENO1, MAPT, NRGN, MIF and GLOD4). Certain proteins such as YWHAQ, DDT, RWDD1, DNPH and TBCA showed stronger associations with the longitudinal tau-PET rate of change than baseline tau-PET load (all standardized coefficients between 0.12 and 0.24, P FDR from 0.01 to <0.001; Fig. 4 ). The 13 proteins most associated with tau-PET rate of change (all P FDR  ≤ 0.01 in Aβ-positive participants) were all preferentially expressed in neurons, but were not enriched in any biological processes. In the Supplementary Results , we detail analyses based on all Olink proteins.

figure 4

Standardized β coefficients from linear models relating accumulation of AD fibrillar pathology (Aβ- and tau-PET rate of change both included as independent variables) to the CSF protein levels. Models included age, sex and mean overall protein level as covariates. Only proteins with significant associations with tau-PET rate of change are displayed, because there were no associations with Aβ-PET rate of change. The top row included all CU and MCI participants and the bottom row only Aβ-positive CU and MCI participants (as in Fig. 3a ). All linear regressions performed were two sided and P values were adjusted for the FDR. * P FDR  < 0.05, ** P FDR  < 0.01, *** P FDR  < 0.001.

The AD proteome is distinct from generalized neurodegeneration

Having identified and characterized proteins altered along the A/T continuum, we next sought to establish the specificity of these proteins to AD. To do this, we examined differential protein abundance in the non-AD group (Extended Data Table 1 for all diagnoses). We compared the non-AD group with the A + T + group (Fig. 5a ), as well as the A − T − group (Fig. 5b , see statistical results in Supplementary Table 1 ), to assess which DAPs in AD were also differentially abundant in other neurodegenerative contexts. The proteins showing the strongest elevation in A + T + versus non-AD were all proteins previously identified in analyses restricted to the AD continuum, suggesting their strong specificity to AD. This finding is underscored in Fig. 5c (four-way Venn diagram), in which 14 of the 15 core proteins were differentially abundant only in the AD continuum (that is, upregulated in A + T − versus A − T − and in A + T + versus both A + T − and non-AD). Similarly, almost all early proteins were specific to AD. However, zooming in on proteins that differed in non-AD compared with A − T − and in AD (Fig. 5d ), we identified DAPs more generally associated with neurodegeneration. A particular intersection of 20 proteins stood out, composed of 18 late downregulated proteins (BSG, CBLN4, CD99L2, CDH6, CHL1, CNTN3, EPHA10, ESAM, GFRA2, LYVE1, MANSC1, MEGF9, NELL1, PAM, PTPRN2, PTPRS, TNFRSF4 and WFDC2) and 2 upregulated late proteins (NfL and HGF). These proteins were differentially abundant in non-AD compared with A − T − , as well as in A + T + compared with A + T − . Enrichment analyses on the set of significantly less abundant proteins common in the non-AD and the A + T + groups highlighted biological processes related to axonogenesis, axon development and neuronal morphogenesis (Supplementary Table 7 ), supporting the notion that these processes are probably impaired in the later stage of neurodegeneration.

figure 5

a , b , Volcano plots depicting DAPs in different groups: A + T + ( n  = 231) versus non-AD neurodegenerative diseases ( n  = 110) ( a ) and A − T − ( n  = 352) versus non-AD neurodegenerative diseases ( b ). Models included age, sex and mean overall protein level as covariates. The red line represents the threshold of P FDR  < 0.01, above which we considered proteins for subsequent analyses. c,d , Venn diagram summarizing the comparisons shown in a and b along with those based on the A/T categories in Fig. 2a,b based on proteins significant at P FDR  < 0.01. The red text corresponds to ‘early’ proteins, the black to ‘core’ proteins and the blue to ‘late’ proteins as per categories defined in previous analyses. c , The DAPs highlighted are those related to AD. d , The DAPs highlighted are those at the intersections between AD and non-AD neurodegenerative disease. All standardized β values displayed come from two-sided linear regressions and all P values were adjusted for the FDR. e , Box plots of exemplary DAPs between the different A/T categories as well as non-AD neurodegenerative diseases. The values correspond to residual values after regressing out age, sex and mean overall protein level. In all box plots, the box limits represent the first and third quartiles and the whisker extends to 1.5× the interquartile range. The red dot represents the mean and the red line extends ±1 s.d. f , Depiction of steps and spectral embedding from which the inferred trajectory of AD pathology (pseudotime) was estimated. g , All proteins shown in f plotted against the pseudotime using GAMs. The first dashed line corresponds to Aβ positivity and the second to tau positivity. Error bands around the data correspond to the 95% confidence interval (CI).

figure 6

a , Bar plots showing the number of proteins in each module derived based on protein co-expression. b , Bar plots showing the distribution of the DAPs in each protein co-expression module. c , Average protein level from each module plotted against the pseudotime using GAMs. d , Summary figure of associations with baseline Aβ- and tau-PET load as well as tau-PET rate of change, cell-type enrichment and summary terms from functional analyses of biological processes for each module. For simplicity, only significant results from two-sided regressions ( P FDR  < 0.05; P values were adjusted for the FDR) are displayed in colored cells. e , Average protein level from modules 2 and 5 (the two modules containing most DAPs) plotted against Aβ- and tau-PET load (baseline) and tau-PET rate of change (longitudinal) using GAMs. f , Key AD markers along with the average DAPs from modules 2 and 5 plotted against the pseudotime using GAMs. The tau tangles were measured from tau-PET SUVR in a temporal meta-ROI, Aβ and phosphorylated tau were measured, respectively, from Elecsys CSF Aβ40/Aβ42 and p-tau181, atrophy was measured from cortical thickness in the temporal lobe and cognition was measured from the MMSE total score. In all panels, error bands around the data correspond to the 95% CI.

We represented some of the key DAPs pertaining to the different group comparisons in Fig. 5e , with more description of these proteins in Supplementary Results . Of note, DOPA decarboxylase (DDC) was the protein showing the greatest elevation specifically in non-AD compared with both A − T − and A + T + . Higher levels of DDC in the non-AD group were particularly attributable to the high proportion of patients with a Parkinsonian disorder (Supplementary Fig. 3 ), in line with recent work 21 , 22 . Some well-established AD-related proteins were also found to be differentially abundant, specifically along the A/T continuum (MAPT and NRGN; core proteins), or were elevated in both the AD and the non-AD groups (NfL and CHI3L1 (YKL-40)), and are displayed in Extended Data Fig. 7a .

Protein trajectories along a pseudotime of AD progression

To provide a granular and sequential measure of advancing AD pathology, we used trajectory inference methods to compare key DAPs on a common continuous scale, representing increasing Aβ and fibrillar tau pathology (Fig. 5f and Supplementary Fig. 4 ). Each participant was assigned to a point along this AD progression pseudotime. The key individual proteins were then plotted along the pseudotime, revealing how their levels changed along the AD progression (Fig. 5g ). SMOC1 (early protein) increased early on and plateaued later in the progression, accurately recapitulating the earlier results from contrasting the different A/T categories. Linear increases in TXNRD1 (core protein) and YWHAQ (late protein) were detected past the point of Aβ positivity. FABP3 also showed rapid elevation, particularly in the last quarter of the pseudotime, with levels exceeding all other proteins. The established AD-related proteins MAPT, NRGN and NfL also showed linear increases past the point of Aβ positivity, with levels being highest for MAPT (Extended Data Fig. 7b ).

Metabolic and inflammatory alterations along the AD spectrum

Protein levels often vary as part of larger networks of co-expressed proteins encompassing different molecular processes. We therefore sought to characterize whether the DAPs that we identified were mostly independent or whether they were particularly salient nodes of larger modules of proteomic changes in response to AD pathology. To do so, we grouped all Olink proteins into biological modules based on protein co-expression using a community detection algorithm 23 . Six principal modules were identified (Fig. 6a and see Supplementary Table 1 for protein assignment to each module). Modules 2 and 5 were particularly enriched with key AD DAPs (50 DAPs in module 2 and 42 in module 5). Module 2 included most of the late downregulated proteins in AD (36 of 48). Module 5 included a mix of core proteins (13 of 15) and early and late upregulated DAPs, in similar proportions (Fig. 6b ). Plotting each module's average protein levels along the pseudotime recapitulated the strong association between modules 2 and 5 with the inferred AD trajectory, in opposite directions (Fig. 6c ). While levels in module 2 decreased with increasing AD pathology (in line with the high proportion of downregulated proteins composing this module), levels in module 5 increased. Modules 2 and 5 were also the modules most strongly associated with both baseline and longitudinal Aβ- and tau-PET (Fig. 6d,e ).

To better characterize the different modules, cell-type and functional enrichment analyses were also performed (summary in Fig. 6d ; all significant terms in Supplementary Table 8 ). Module 2 was enriched with proteins expressed in microglia and endothelial cells (Extended Data Fig. 5b ) and mainly related to immune response, cell adhesion and lymphocyte and leukocyte regulation. A high concentration of inflammatory (cytokines and interleukins) and antigen-presenting microglia proteins was found in this module, based on microglia states previously identified from brain tissue in AD 24 (Extended Data Fig. 8 ). Module 5 contained proteins related to amino acid metabolic process, microtubule organization and DNA repair. Module 4 contained only a few DAPs, but was associated with tau-PET and enriched with proteins related to oligodendrocytes and OPCs (Fig. 6c,d ). Lastly, half of our module assignments showed consistency with previous CSF proteomic modules, particularly those related to lysosomes, the complement cascade and axonal development (details in Supplementary Fig. 5 ).

Trajectories of DAPs in modules 2 and 5 were integrated with key markers of AD, such as CSF Aβ42/Aβ40, CSF p-tau181, tau-PET, neurodegeneration (measured with cortical thickness) and cognition (measured with the Mini-Mental State Examination (MMSE)) along the AD pseudotime (Fig. 6f ). This visualization helps resolve the differing phases of molecular changes in response to the stereotyped accumulation of AD pathology. Proteins related to metabolic processes (module 5) become abundant in CSF before widespread deposition of fibrillar tau and increase in a manner closely tied to abundance of soluble p-tau in the CSF. Meanwhile, downregulated proteins decrease later in the disease continuum when neurodegeneration and cognitive decline are also present.

In the present study, we identified, validated and characterized several proteins differentially abundant in the CSF of individuals spanning the full spectrum of the AD continuum. We measured the presence and magnitude of AD pathology using Aβ- and tau-PET biomarkers (which quantify fibrillar pathology), both at baseline and over time. Previous proteomic studies had used soluble p-tau as a measure of tau pathology 12 , 25 , 26 , but rather it is the tau aggregate pathology that is independently related to neurodegeneration and cognitive decline 15 , highlighting the importance of using tau-PET. Many of the DAPs that we identified were validated in separate datasets and we tested their specificity to AD by additionally contrasting them in patients with other neurodegenerative diseases. We identified distinct sets of proteins that are altered at different stages of the disease, that is, dysregulated early on when participants have only isolated Aβ-plaque pathology versus further altered with the development of fibrillar tau pathology. Many proteins participated in two co-expression networks that became increasingly disturbed alongside the pathological progression of AD and which probably represent key metabolism- and inflammation-related pathways disrupted in AD. Our analyses suggest that the sequential aggregation of Aβ and tau pathologies is mirrored by specific phases of molecular changes and that many of these processes are highly specific to the AD pathophysiological cascade.

The earliest stage of AD involves a continuous accumulation of Aβ-plaque load in the cerebral cortex but without neocortical aggregation of tau tangles 27 . This is a critical stage to study for potential intervention given that cognition and brain integrity remain relatively undisturbed. A specific subset of proteins was already altered in the CSF of individuals within this early AD population. A notable finding from this group included the increased levels of SMOC1 in the CSF, which tracked Aβ-plaque load in the brain, even among participants exceeding the Aβ-positivity threshold. SMOC1 has also been found to be the main protein for which changes in abundance precede Aβ abnormality in autosomal dominant AD 10 , as well as one of the most positively correlated proteins with Aβ-PET in a subset of the ADNI cohort 11 . SMOC1 is expressed in OPCs, is a core component of a biological module related to the extracellular matrix (the ‘matrisome’) 7 and has been repeatedly found as a key DAP in CSF and brain tissue of patients with AD 8 , 12 , 26 , 28 , 29 . Recent work demonstrated cell senescence of OPCs in the Aβ-plaque environment, with OPCs adopting an inflammatory phenotype and being unable to differentiate into oligodendrocytes 30 , which might provide insights into the role of SMOC1 in AD. MAPT was also among the proteins that emerged as being already elevated in A + T − versus A − T − , foreshadowing the fibrillar tau deposition that ensued in A + T + , in line with the different soluble tau variants that reach abnormal levels in CSF or plasma as Aβ plaques become present in the brain 31 , 32 .

A set of 15 core proteins emerged, the abundance of which in the CSF increased throughout the disease continuum, being elevated in people with isolated Aβ pathology but even further elevated with emergence of cortical fibrillar tau pathology. These proteins included established markers of AD like MAPT (encoding the tau protein) and neurogranin, but also proteins related to immune response (ITGAM, ITGB2 (both expressed in microglia) and MIF), apoptosis (CRKL and MAP2K1) and calcium-dependent signaling (PPP3R1, part of the calcineurin complex). These core proteins showed a very high overlap with the proteins changed in autosomal dominant AD 9 ; of the top 20 DAPs previously identified in autosomal dominant AD, all except one were among the DAPs identified in the present study, and 11 were among the core proteins. These results suggest great commonality between sporadic and autosomal dominant AD and, together with results from other recent proteomic studies 6 , 12 , support these proteins for improved molecular staging of AD and identification of new therapeutic targets. Furthermore, we found that almost all of these core proteins were specific to AD, because they were also upregulated in the A + T + group compared with the non-AD group. The predominance of microglia proteins was also found in one of the co-expression modules changing with AD pathology, highlighting a central component of the immune response in AD.

Proteins altered in the early stages of AD (A + T − ) were enriched in different biological processes, including protein ubiquitination, cellular detoxification, synaptic transmission and response to superoxide species. On inspection of the different proteins individually, the link to oxidative stress was evident for many proteins (DDAH1, GGCT, GSR, PARK7, SOD1 and SOD2), highlighting probable early neural responses to accumulating AD pathology. When contrasting participants with isolated Aβ plaque pathology (A + T − ) with individuals with both cortical Aβ plaque and fibrillar tau pathologies (A + T + ), we found the latter group to be characterized by both increased and decreased abundance of many proteins in the CSF. Several upregulated proteins related to ATP metabolism were independently associated with tau fibrillar load and accumulation (for example, ENO1, ENO2, YWHAQ (member of the 14-3-3 family) and FABP3). These results suggest that dysfunction in energy metabolism is a key alteration in AD pathophysiology that occurs later in the disease progression. The metabolic and glycolytic changes were also reflected in the protein co-expression modules, where the average protein level of the corresponding module increased late in the disease continuum, when widespread fibrillar pathology is present. The 14-3-3 proteins are also increasingly reported as differentially abundant in AD across studies 11 , 12 .

Downregulated proteins had seldom been reported in previous studies, where the main comparisons were usually limited to contrasting controls and patients with AD and often without reliable biomarker evidence. The downregulated proteins found only in the A + T + group are in line with evidence from cell biology, suggesting that the presence of both Aβ and tau pathologies leads to neuronal hypoexcitability, and thus potentially reduced protein secretion, in opposition to the presence of Aβ alone, which leads to hyperexcitability 33 , 34 , 35 . Also, about half of these proteins were decreased across neurodegenerative diseases, not limited to AD, perhaps indicating involvement in the neurodegenerative process (or response to it) more generally. Several proteins from this group (NELL1, CDH6, CBLN4 and TNFRSF4) were among those for which regional gene expression correlated with the regional tau-PET pattern, but the levels of these proteins in the CSF were not associated with the tau-PET load. We speculate that these proteins may be upstream of tau fibril accumulation, but also acknowledge that protein abundance and RNA expression are not highly correlated 7 , 36 . As expression of transcripts encoding these proteins was particularly high in regions of the temporal lobe, it might also highlight the vulnerability of these brain regions to neurodegeneration more generally, beyond AD.

The regional distribution of transcripts for the receptor tyrosine kinase MET was also associated with the distribution of fibrillar tau pathology. It is interesting that levels of both MET and its activating ligand HGF (hepatocyte growth factor) were altered in the CSF among individuals with AD pathology. MET and HGF are part of the HGF receptor signaling pathway, a well-established pathway related to apoptosis and cell survival in neurodegenerative diseases 37 . In total, across all DAPs identified, five (HGF, MET, CRKL, MAP2K1 and PLAU) are part of the HGF signaling pathway 38 . These results suggest that this pathway plays an important role in regulating cell death in AD, and potentially in other neurodegenerative diseases, because MET and HGF were also differentially abundant in non-AD neurodegenerative diseases compared with the A − T − group. Further supporting this notion, two ongoing phase 2 trials in patients with mild-to-moderate AD dementia ( NCT04488419 ) and Parkinson's disease (PD) dementia or dementia with Lewy bodies ( NCT04831281 ) are testing a drug targeting HGF and MET.

Our study has several strengths, including the large sample size and the deeply phenotyped participants with cross-sectional and longitudinal Aβ- and tau-PET, validation in external datasets and comparisons with other neurodegenerative diseases. This work was limited to CSF and we acknowledge that future studies integrating CSF and plasma proteomics could help identify more accessible markers of interest to refine disease staging. Patients with AD and non-AD neurodegenerative diseases probably harbor several pathologies and further understanding of the molecular changes underlying different pathologies will be of great interest. Inference drawn from imaging transcriptomics was limited to the few brains available and to molecular changes that happen post mortem. Still, we applied thorough statistical analyses to ensure robustness of the results. Our sample was limited in terms of ethnic and racial diversity, with most participants being self-reported white. In light of recent work highlighting different protein changes with race 39 , it will be important to expand proteomic studies to more diverse populations.

Overall, using a multi-omics approach, we provided new insights into key proteins and molecular pathways that co-occur with, and follow the accumulation of, Aβ and tau load along the AD continuum. Our study highlights the importance of focusing on the underlying pathology, because certain groups of proteins were uncovered specifically in relation to isolated Aβ-plaque pathology or to combined Aβ-plaque and tau tangle pathologies, whereas others were increasingly altered as pathology advanced. A portion of these proteins was also altered in other neurodegenerative diseases, suggesting partly shared pathways leading to degeneration. The comprehensive analyses yielded new potential targets for therapeutic approaches at different stages of the disease.

Participants: BioFINDER-2 cohort

Participants were part of the ongoing prospective Swedish BioFINDER-2 cohort ( NCT03174938 , http://www.biofinder.se ), which spans the full spectrum of the AD continuum, ranging from adults with intact cognition or subjective cognitive decline, through MCI, to AD dementia, as well as patients with non-AD neurodegenerative diseases. All participants were recruited at Skåne University Hospital and the Hospital of Ängelholm, Sweden. The main inclusion criteria were, as described previously 40 : age ≥40 years; fluency in the Swedish language; and an MMSE score of 27–30 (or 26–30, depending on age) for CU participants, 24–30 for patients with MCI and ≥12 for patients with AD dementia. Exclusion criteria included unstable systemic illness, neurological or psychiatric illness, alcohol or substance abuse, or refusal to undergo lumbar puncture or neuroimaging. MCI diagnosis was established if participants performed more than 1.5 s.d. below the normative score on at least one cognitive domain from an extensive neuropsychological battery examining verbal fluency, episodic memory, visuospatial ability and attention/executive domains 41 . AD dementia diagnosis was determined using the criteria for dementia caused by AD from the Diagnostic and Statistical Manual of Mental Disorders , 5th edn (DSM-5) 42 and, if a positive diagnosis was made, this was confirmed using Aβ biomarkers based on the updated National Institute on Aging and Alzheimer's Association (NIA-AA) criteria for AD 43 . The group of non-AD neurodegenerative diseases was composed of patients who fulfilled criteria for dementia owing to frontotemporal dementia, PD with dementia, subcortical vascular dementia, PD, progressive supranuclear palsy, multiple system atrophy, corticobasal syndrome or primary progressive aphasia. Clinical diagnosis was determined according to the main criteria for each disease, as detailed elsewhere 40 , and was done either at baseline or during the course of follow-up visits. The study was approved by the Regional Ethics Committee in Lund, Sweden. All participants gave written informed consent to participate and they were compensated for each study visit. All data for the current study were acquired between April 2017 and December 2022. All participants included in the present study had complete proteomic data and CSF Aβ42/Aβ40 and tau-PET measures available at baseline, totaling 935 individuals.

CSF proteomic measures

CSF samples from baseline visits were analyzed with a validated, highly sensitive and specific Olink (Uppsala, Sweden) proteomic assay, an antibody-based Proximity Extension Assay 44 . In BioFINDER-2, a total of 2,943 proteins, corresponding to Olink Explore 3072, were measured across 8 multiplex protein panels (oncology, neurology, cardiometabolic, inflammation, oncology II, neurology II, cardiometabolic II, inflammation II), each containing 367–369 proteins. All measurements were performed by the company, blinded to any information on the samples. Samples were randomized across plates and appropriate controls were included on each plate, to allow for thorough quality control (QC) as per the Olink technology (see ref. 45 for all details). Samples that did not pass QC received a warning label. Each assay had a lower limit of detection (LOD) provided, defined as 3 s.d. above background (based on three negative controls included on every plate). Protein levels were reported as normalized protein expression (NPX) values, a relative quantification unit corresponding to a log 2 scale provided by Olink. Only proteins for which at least 70% of participants had levels above the LOD were retained for further analysis, resulting in 1,331 proteins. The distribution of the proteins meeting the LOD criterion across the different Olink panels is provided in Supplementary Table 9 . From these proteins, a few datapoints had an assay warning label (about 0.2% of the data, 46–64 measurements on 37 proteins) and were excluded from analyses, but all datapoints below the LOD were kept in the analysis, as per the Olink recommendations and in line with recent analyses 6 , 46 . Proteomic data distribution was assumed to be normal but this was not formally tested. We noted a clear interaction between APOE ε4 genotype and apolipoprotein E (ApoE) levels in Olink and provide all analyses to that effect in Supplementary Results and Supplementary Figs. 1 and 2 . Given that we cannot yet explain this effect, we removed ApoE from the main analyses and discuss it in the Supplementary Results section 'Olink CSF ApoE results'. We should note that results from all proteins except ApoE were not influenced by APOE ε4 genotype.
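
As a minimal sketch of this filtering step (not the study's own code; the data layout and variable names below are assumptions), the LOD-based retention criterion and exclusion of flagged datapoints could look as follows:

```python
import pandas as pd

def filter_proteins(npx: pd.DataFrame, lod: pd.Series, warnings: pd.DataFrame,
                    min_frac_above_lod: float = 0.70) -> pd.DataFrame:
    """npx: participants x proteins NPX values; lod: per-protein lower limit of
    detection; warnings: boolean flags for datapoints with an assay warning label
    (all hypothetical names and layouts)."""
    # Fraction of participants with levels above the protein-specific LOD
    frac_above = npx.gt(lod, axis=1).mean(axis=0)
    # Retain proteins detected above the LOD in at least 70% of participants;
    # datapoints below the LOD in retained proteins are kept, per Olink recommendations
    kept = npx.loc[:, frac_above >= min_frac_above_lod]
    # Exclude the individual datapoints that carry an assay warning label
    return kept.mask(warnings[kept.columns])
```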

Mean overall protein level as a covariate

We also calculated a measure of the mean overall protein level, to be included in all analyses as a nuisance covariate. The rationale for including such a measure was to account for interindividual variability in CSF dilution, which could be the result of different rates of clearance or production. We recently showed that nearly half of the individual protein levels correlated highly with the overall protein average 47 , making it an important confounder to consider in proteomic analyses. For instance, for a given individual, higher or lower values across proteins might depend on their overall CSF protein level and, as such, these CSF dynamics should be accounted for. In the present study, we calculated the average z -scored NPX values from all highly detected proteins, that is, proteins for which levels for >90% of participants were above the LOD ( n  = 1,157), to ensure that this overall average reflected proteins measured with high confidence. We refer to this measure as the mean overall protein level and it was included as a covariate in all analyses, along with age and sex.
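
A minimal sketch of how such a nuisance covariate could be computed from the NPX matrix (same hypothetical data layout as in the previous sketch; not the study's own code):

```python
import pandas as pd

def mean_overall_protein_level(npx: pd.DataFrame, lod: pd.Series,
                               min_frac_above_lod: float = 0.90) -> pd.Series:
    """Average z-scored NPX across highly detected proteins, one value per participant."""
    # Keep only proteins with >90% of participants above the LOD
    frac_above = npx.gt(lod, axis=1).mean(axis=0)
    high_conf = npx.loc[:, frac_above > min_frac_above_lod]
    # z-score each protein across participants, then average per participant
    z = (high_conf - high_conf.mean()) / high_conf.std()
    return z.mean(axis=1)  # used as a covariate alongside age and sex
```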

CSF markers of Aβ

CSF Aβ42 and Aβ40 were measured using the Elecsys immunoassays (Roche Diagnostics) 48 . A pre-established cutoff of 0.08 on the CSF Aβ42/Aβ40 ratio was used to define Aβ positivity based on Gaussian mixture modeling 40 . A minority of participants did not have Elecsys measurements available, in which case clinical routine assays and pre-established cutoffs were used ( n  = 31 with the Lumipulse G assay, cutoff of 0.072 (ref. 49 ); n  = 20 with Meso Scale Discovery, cutoff of 0.077). Participants below the cutoffs were considered to be Aβ positive (A + ), which was used to form the different A/T groups.

PET acquisition and processing

Aβ- and tau-PET images, acquired on digital GE Discovery MI scanners, were available in the BioFINDER-2 cohort. For tau-PET, acquisition was done 70–90 min after injection of ~370 MBq of [ 18 F]RO948 and was available for all participants. For Aβ-PET, acquisition was done 90–110 min after injection of ~185 MBq of [ 18 F]flutemetamol. As per the study protocol, patients with dementia did not undergo Aβ-PET. Images were processed according to our pipeline, described previously 50 . Briefly, PET images were attenuation corrected, motion corrected, summed and registered to the closest T1-weighted magnetic resonance (MR) image processed through the longitudinal pipeline of FreeSurfer v.6.0. Structural T1-weighted images were acquired from a magnetization-prepared rapid gradient echo (MPRAGE) sequence with 1 mm isotropic voxels on a Siemens 3T MAGNETOM Prisma scanner (Siemens Healthineers). The SUVR images were created using the inferior cerebellar gray matter as the reference region for [ 18 F]RO948 and the whole cerebellum for [ 18 F]flutemetamol 16 .

Both continuous values and binary classification from key regions were used in different analyses. For Aβ-PET, the average SUVR was calculated from a global neocortical ROI, including prefrontal, lateral temporal, parietal, anterior cingulate and posterior cingulate/precuneus, and the cutoff for positivity was SUVR = 1.03, defined from Gaussian mixture modeling 16 . For tau-PET, the average SUVR was calculated in a temporal ROI composed of the entorhinal cortex, amygdala, parahippocampal gyrus and inferior temporal and middle temporal gyri 51 . A cutoff of 1.36 SUVR was defined from Gaussian mixture modeling, which was the same cutoff if calculating 2 s.d. from the mean of CU Aβ-negative individuals. Participants above the cutoff were considered tau-positive (T + ), which was used to form the different A/T groups throughout the article.
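
A hedged sketch of deriving such a positivity cutoff from a two-component Gaussian mixture (the exact decision rule is not detailed above, so placing the threshold 2 s.d. above the mean of the lower, biomarker-negative component is an assumption consistent with the tau-PET description):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_cutoff(suvr: np.ndarray, n_sd: float = 2.0) -> float:
    # Fit a two-component Gaussian mixture to the SUVR distribution
    gmm = GaussianMixture(n_components=2, random_state=0).fit(suvr.reshape(-1, 1))
    # Identify the component with the lower mean (the biomarker-negative group)
    neg = int(np.argmin(gmm.means_.ravel()))
    mu = gmm.means_.ravel()[neg]
    sd = float(np.sqrt(gmm.covariances_.ravel()[neg]))
    # Place the positivity threshold n_sd standard deviations above the negative mean
    return float(mu + n_sd * sd)

# Hypothetical usage: participants above the cutoff are labeled T+
# tau_positive = temporal_meta_roi_suvr > gmm_cutoff(temporal_meta_roi_suvr)
```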

A subset of participants also had longitudinal Aβ- and tau-PET data available. For these participants, we derived individual rate of change by fitting linear mixed-effect models with random slope and intercept using the R package lme4 v.1.1-31, where PET SUVR was the dependent variable and time in years from the baseline scan date was the independent variable. The slope of each participant from those models was used to represent Aβ- or tau-PET SUVR change per year. Participants had between two and four follow-up scans. We focused subsequent analyses on participants who had both Aβ and tau rate of change, which resulted in 458 CU and MCI participants with, on average, 2.6 ± 1.0 years of follow-up (range: 1.3–4.4 years).
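
The rates of change were estimated with lme4 in R; as a hedged illustration of the same idea in Python (statsmodels used here instead of lme4, with hypothetical column names), the per-participant slope is the fixed-effect slope plus that participant's random slope deviation:

```python
import pandas as pd
import statsmodels.formula.api as smf

def annual_rate_of_change(long_df: pd.DataFrame) -> pd.Series:
    """long_df: one row per PET visit with columns 'subject', 'years' (time from
    baseline scan) and 'suvr' (assumed names). Returns SUVR change per year."""
    # Random intercept and random slope for time, grouped by participant
    model = smf.mixedlm("suvr ~ years", data=long_df,
                        groups=long_df["subject"], re_formula="~years")
    fit = model.fit()
    fixed_slope = fit.fe_params["years"]
    # Participant-specific slope = fixed slope + that participant's random slope
    slopes = {subj: fixed_slope + re["years"] for subj, re in fit.random_effects.items()}
    return pd.Series(slopes, name="suvr_change_per_year")
```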

Differential protein expression analyses

Linear models were used to establish DAPs between A/T groups, including age, sex and the mean overall protein level as covariates. The first set of comparisons captured the AD continuum: an A − T − versus A + T − contrast captured DAPs in the early stages of AD progression, whereas an A + T − versus A + T + contrast captured DAPs associated with later stages of biomarker progression. Next, separate models contrasting Aβ-negative patients with non-AD neurodegenerative diseases to A − T − and A + T + were performed, to investigate proteins specific to AD as well as proteins generally related to neurodegeneration. The A − T + category included only six participants (three CU, three MCI), and thus this category was not included in analyses. In addition, 52 participants with non-AD neurodegenerative diseases were Aβ positive and were excluded, to capture changes in the non-AD group as independently as possible from AD pathology. Ultimately, differential expression analysis was performed on 877 participants and the results are presented as volcano plots. Analyses were adjusted for multiple comparisons using the Benjamini–Hochberg method (FDR). Given that one of our main objectives was to characterize the strongest DAPs, we focused subsequent analyses on DAPs that were significant at P FDR  < 0.01.
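
A sketch of this per-protein modeling and FDR adjustment (assumed variable names, a binary 0/1 group indicator and a z-scored outcome; not the exact analysis code):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def differential_abundance(data: pd.DataFrame, proteins: list) -> pd.DataFrame:
    """data: one row per participant with a column per protein plus 'group' (0/1
    contrast, e.g. A-T- versus A+T-), 'age', 'sex' and 'mean_protein' (assumed names)."""
    rows = []
    for prot in proteins:
        d = data[[prot, "group", "age", "sex", "mean_protein"]].dropna().copy()
        d["y"] = (d[prot] - d[prot].mean()) / d[prot].std()  # standardize the outcome
        fit = smf.ols("y ~ group + age + sex + mean_protein", data=d).fit()
        rows.append({"protein": prot, "beta": fit.params["group"], "p": fit.pvalues["group"]})
    res = pd.DataFrame(rows)
    # Benjamini-Hochberg adjustment across all proteins tested
    res["p_fdr"] = multipletests(res["p"], method="fdr_bh")[1]
    return res.sort_values("p_fdr")
```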

Validation of key DAPs in independent datasets

To further validate some of the key proteins identified in the main cohort (BioFINDER-2), we analyzed a subset of BioFINDER-1 participants (CU older adults and patients with MCI) who also had Olink proteomic data from CSF available, although fewer proteins and different panels were available (neuro-exploratory, neurology, inflammation and cardiovascular III). BioFINDER-1 ( NCT01208675 ) is an older prospective cohort in which participants were enrolled between 2010 and 2014, with similar inclusion and exclusion criteria as for BioFINDER-2 (ref. 52 ). Participants ( n  = 631) who had proteomic data and CSF Aβ42/Aβ40 to determine their Aβ status were included. We analyzed the proteins overlapping with those available in the main cohort, for a total of 202 proteins. Tau-PET was not available in this cohort. We thus performed differential expression analyses comparing A − with A + participants.

We also validated some of the key proteins in ADNI, where CSF proteomic levels of approximately 7,000 proteins were measured with SomaScan 7K (v.4.1) for 758 individuals across all ADNI studies, as described previously 12 . We selected participants who had proteomic data available within an 18-month window from Aβ-PET, yielding 463 participants for analyses. ADNI is a multisite study launched in 2003 as a public–private partnership. The primary goal of ADNI has been to test whether serial MRI, PET, other biological markers and clinical and neuropsychological assessment can be combined to measure the progression of MCI and early AD. For up-to-date information, see www.adni-info.org . Analyses were done on proteins that overlapped with those available in BioFINDER-2, for a total of 1,438 aptamers. Positivity for Aβ-PET was taken from the ‘AMYLOID_STATUS’ variable, reflecting different tracer-dependent thresholds for florbetapir or florbetaben, based on the most updated methods provided by ADNI (see Amyloid PET processing methods v.2, revised 29 June 2023). Tau-PET was not available in the participants with proteomics. We thus performed differential protein abundance analyses between A − and A + participants including age and sex as covariates.

Imaging transcriptomics relating regional gene expression and AD pathology

To further investigate the relationship between key DAPs and AD pathology, we performed region-wise association between gene expression of the DAPs across brain regions and average Aβ- and tau-PET across the same brain regions. Gene expression was generated using the regional microarray expression data obtained from six postmortem brains provided by the AHBA 53 ( https://human.brain-map.org ). Data were processed with the abagen toolbox v.0.1.3 (ref. 54 ) ( https://abagen.readthedocs.io/en/stable/index.html ), a Python-based package to access and work with the Allen Brain data microarray expression data. Briefly, processing steps involved collapsing data into ROIs (parcels of a specified brain atlas) and combining across donors. Regional microarray expression data were generated for two brain atlases—the Desikan–Killiany atlas and the Schaefer 200 parcels—both in MNI (Montreal Neurological Institute) space. The abagen default parameters were used, with the exception that, for the Desikan–Killiany atlas, gene normalization was done within structural classes (cortical, subcortical/brainstem), given that gene expression can differ in cortical versus subcortical regions 53 , 55 . All details of the abagen workflow for generating these data are as follows: first, microarray probes were reannotated using data provided by Arnatkevic̆iūtė et al. 55 ; probes not matched to a valid Entrez ID were discarded. Next, probes were filtered based on their expression intensity relative to background noise 56 , such that probes with intensity less than the background in ≥50% of samples across donors were discarded, yielding 31,569 probes. When multiple probes indexed the expression of the same gene, we selected and used the probe with the most consistent pattern of regional variation across donors (that is, differential stability). In the present study, regions correspond to the structural designations provided in the ontology from the AHBA. The MNI coordinates of tissue samples were updated to those generated via nonlinear registration using the Advanced Normalization Tools. Samples were assigned to brain regions in the provided atlas if their MNI coordinates were within 2 mm of a given parcel. To reduce the potential for misassignment, sample-to-region matching was constrained by hemisphere and gross structural divisions (that is, cortex, subcortex/brainstem and cerebellum, such that, for example, a sample in the left cortex could be assigned only to an atlas parcel in the left cortex 55 ). All tissue samples not assigned to a brain region in the provided atlas were discarded, yielding a total of 83 regions. Intersubject variation was addressed by normalizing tissue sample expression values across genes using a robust sigmoid function and then rescaled to the unit interval. Gene expression values were then normalized across tissue samples using an identical procedure. Normalization was performed separately for samples in distinct structural classes (that is, cortex, subcortex/brainstem and cerebellum). Samples assigned to the same brain region were averaged separately for each donor and then across donors, yielding a regional expression matrix with brain regions as rows and genes as columns.

To get regional levels of AD neuropathology, for each region of the Desikan–Killiany (all cortical regions, hippocampus and amygdala) and Schaefer 200 atlases, we averaged Aβ- and tau-PET across Aβ-positive participants. The main correlational analyses were performed across all DAPs previously identified. To ensure that analyses were not driven by statistical autocorrelations, we performed null modeling with BrainSMASH ( https://brainsmash.readthedocs.io/en/latest/index.html ), a Python-based package for statistical testing of spatially autocorrelated brain maps 57 . BrainSMASH simulates surrogate brain maps with spatial autocorrelation that matches the spatial autocorrelation of the original brain map of interest, in the present study the Aβ- or tau-PET map. We used a Euclidean distance matrix of each atlas as input to generate 1,000 surrogate maps preserving autocorrelation, otherwise using the default BrainSMASH parameters. Pearson’s correlation was then computed region-wise between (1) the original maps of interest, that is the Aβ/tau-PET and the gene expression maps, and (2) each surrogate PET map and the original gene expression map, to create a null distribution. Nonparametric P values were then computed, corresponding to the frequency that correlation with the surrogate maps exceeded the observed correlation with the original gene expression map. Genes were considered to have expression profiles significantly related to pathology when they had BrainSMASH-corrected P values < 0.05 across both atlases.
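
A minimal sketch of this surrogate-map test, assuming a regional PET vector, a matching regional gene-expression vector and a regional Euclidean distance matrix are already in hand (variable names are hypothetical):

```python
import numpy as np
from brainsmash.mapgen.base import Base

def smash_corrected_p(pet_map: np.ndarray, gene_expr: np.ndarray,
                      dist_mat: np.ndarray, n_surrogates: int = 1000):
    """Correlate a regional PET map with regional gene expression and compute a
    nonparametric p value against autocorrelation-preserving surrogate PET maps."""
    observed = np.corrcoef(pet_map, gene_expr)[0, 1]
    # Surrogates matched to the spatial autocorrelation of the PET map
    generator = Base(x=pet_map, D=dist_mat)
    surrogates = generator(n=n_surrogates)
    null_r = np.array([np.corrcoef(s, gene_expr)[0, 1] for s in surrogates])
    # Frequency with which surrogate correlations are as extreme as the observed one
    p = float(np.mean(np.abs(null_r) >= np.abs(observed)))
    return observed, p
```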

Cell-type enrichment analysis

To measure cell-type expression of the different genes of interest, we used the RNA sequencing (RNA-seq) data from the middle temporal gyrus (MTG) available from the Allen Brain Atlas consortium (Human MTG 10x SEA-AD 2022, https://portal.brain-map.org/atlases-and-data/rnaseq/human-mtg-10x_sea-ad ), applying a similar approach to that described previously 58 . This dataset includes single-nucleus transcriptomes from 166,868 total nuclei of five postmortem human brain specimens. We downloaded the Seurat object and applied the function AverageExpression from the R package Seurat v.4.3.0 (refs. 59 , 60 ) to generate cell-type expression levels. Using the class and subclass annotation available from the Allen Brain data, we calculated the average expression from non-neuronal cells (microglia, astrocytes, oligodendrocytes and OPCs) and neuronal cells (GABA-ergic neurons and glutamatergic neurons), after removing the annotations labeled ‘None’. We then calculated the percentage expression across all these cell types from the average expression.
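
The averaging itself was done with Seurat's AverageExpression in R; the subsequent percentage computation can be illustrated with a short pandas sketch (assuming the averaged expression has been exported as a genes × cell-types table):

```python
import pandas as pd

def percent_expression_by_cell_type(avg_expr: pd.DataFrame) -> pd.DataFrame:
    """avg_expr: genes x cell types matrix of average expression (assumed layout).
    Returns the percentage of each gene's summed average expression attributable
    to each cell type (rows sum to 100)."""
    return avg_expr.div(avg_expr.sum(axis=1), axis=0) * 100
```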

We also performed cell-type enrichment analysis using the Allen Brain MTG data as the cell-type dataset in the R package Expression Weighted Cell Type Enrichment (EWCE) v.1.6.0 (ref. 61 ). We performed bootstrap cell-type enrichment analyses ( n  = 1,000), providing a list of key gene hits and defining the background set as the 1,331 Olink proteins used across analyses. The cell types included were microglia, astrocytes, oligodendrocytes, OPCs, GABA-ergic neurons and glutamatergic neurons. Enrichment was considered significant if it survived adjustment for multiple comparisons at P FDR  < 0.05 (Benjamini–Hochberg method).

Differential gene expression

We used the recently published single-nucleus dataset of 2.3 million nuclei from the prefrontal cortex of 427 ROSMAP participants to investigate differential gene expression of some of the key proteins identified in relation to postmortem AD pathology 20 . Differential gene expression across all cell types and AD pathology was downloaded from Github ( https://github.com/mathyslab7/ROSMAP_snRNAseq_PFC/tree/main/Results/DecontX_RUVr_Differential_gene_expression_analysis/muscat_DecontX/Results ). We considered differential gene expression in relation to pathology to be significant for P FDR  < 0.05, correcting for multiple comparisons across all cell types and all measures of pathology.

The average gene expression within each of the main cell types (microglia, astrocytes, oligodendrocytes, OPCs, GABA-ergic neurons and glutamatergic neurons) was also generated in this dataset using Seurat as described above. Those measures were derived using the processed data (single-nucleus RNA sequencing (snRNA-seq) 10×) for each cell type available on Synapse ( https://www.synapse.org/#!Synapse:syn52293433 ). We generated the average expression across (1) the whole dataset of 427 participants, (2) only A − donors, based on a CERAD (Consortium to Establish a Registry for Alzheimer's Disease) score of possible or no AD, and (3) only A + donors, based on a CERAD score of probable or definite AD.

Functional enrichment analyses

For functional enrichment analyses, we used the WEB-based GEne SeT AnaLysis Toolkit (WebGestalt: https://www.webgestalt.org ) 62 , 63 . We performed human over-representation analyses using the GO database for biological process (BP) and cellular component (CC), defining the background set as the 1,331 Olink proteins used across analyses. We used the default parameters from WebGestalt and reported only terms that survived adjustment for multiple comparisons based on a P FDR  < 0.05 (Benjamini–Hochberg method). To reduce term redundancy in displaying the main results, we used the affinity propagation option from the R package apcluster to cluster gene sets. All individual significant GO terms are provided in Supplementary Tables 6–8 .
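
WebGestalt performs the over-representation testing itself; purely to illustrate the underlying principle (a hypergeometric test of each GO term against the 1,331-protein background), a hedged sketch could look as follows, with GO term membership sets as assumed inputs:

```python
from scipy.stats import hypergeom
from statsmodels.stats.multitest import multipletests

def over_representation(hits: set, background: set, go_terms: dict) -> dict:
    """hits: proteins of interest; background: the 1,331 Olink proteins;
    go_terms: mapping of GO term -> set of annotated proteins (assumed input)."""
    N, n = len(background), len(hits & background)
    raw = {}
    for term, members in go_terms.items():
        K = len(members & background)          # annotated proteins in the background
        k = len(members & hits & background)   # annotated proteins among the hits
        # P(X >= k) when drawing n proteins from a background of N with K annotated
        raw[term] = hypergeom.sf(k - 1, N, K, n)
    terms = list(raw)
    p_adj = multipletests([raw[t] for t in terms], method="fdr_bh")[1]
    return dict(zip(terms, p_adj))
```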

Inferred trajectory using pseudotime

We created a pseudotime variable based on AD pathology, to be able to compare protein trajectories along a continuous measure of AD progression. The method used was adapted from Tasaki et al. 36 . Input variables for the pseudotime included state-of-the-art biomarkers of AD pathology that showed the greatest availability across the full sample of BioFINDER-2 participants ( n  = 1,504), to derive the AD pseudotime as robustly as possible. Specifically, we used (1) plasma tau phosphorylated at Thr217, which has recently been shown to be slightly more closely related to Aβ pathology in the brain than CSF Aβ42/Aβ40 (ref. 32 ), and (2) tau-PET SUVR from three meta-ROIs representing Braak stages 64 (Braak I–II, III–IV and V–VI). First, we applied principal component analysis (PCA) to the different AD biomarkers and the four PCs were included as input to a spectral embedding analysis. These steps were implemented in the scikit-learn package 65 (v.0.24.2) in Python. The spectral embedding analysis reduced the data nonlinearly to a latent two-dimensional (2D) space in which individuals were embedded based on their biomarker values. We inferred a trajectory across this 2D space using SCORPIUS 66 v.1.0.8, an R package that infers chronologies in an unsupervised manner and is among the best-performing trajectory inference methods 67 . Briefly, SCORPIUS partitioned samples into clusters and optimized the shortest and smoothest path going through the center of these clusters. Each datapoint (participant) is thus assigned a coordinate along this pseudotime trajectory, going from 0, representing the lowest levels of pathology, to 1, representing the highest levels of pathology. Individual key proteins or the average of protein modules was then plotted against the pseudotime using smoothed generalized additive model (GAM) lines from ggplot2 to estimate the trajectories of DAPs across a granular estimate of AD progression.
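
The dimensionality-reduction steps preceding the trajectory fit can be sketched with scikit-learn as follows (biomarker column names are assumptions; the SCORPIUS trajectory inference itself, done in R, is not reproduced here):

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.manifold import SpectralEmbedding

def embed_ad_biomarkers(biomarkers: pd.DataFrame) -> np.ndarray:
    """biomarkers: one row per participant with plasma p-tau217 and the three
    Braak-stage tau-PET meta-ROI SUVRs (assumed column names). Returns the 2D
    embedding that would subsequently be passed to trajectory inference."""
    X = StandardScaler().fit_transform(biomarkers.values)
    pcs = PCA(n_components=4).fit_transform(X)  # all four PCs retained
    return SpectralEmbedding(n_components=2, random_state=0).fit_transform(pcs)
```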

Co-expression modules analysis

We derived modules of co-expressed proteins using the consensus clustering algorithm SpeakEasy, which allows for robust community detection. We regressed out age, sex and mean overall protein level from each protein level (the covariates used across all analyses) and used the standardized residuals from each protein to generate a protein–protein correlation matrix based on Spearman’s correlation (1,331 × 1,331 proteins), which served as input for clustering. We also applied PCA to Spearman’s correlation matrix of the residual protein data to investigate the data organization. The first PC, explaining 13% of the variance, showed a strong anti-correlation with protein–protein correlation, potentially masking more modular and disease-relevant signal. We thus further regressed PC1 from the data before applying SpeakEasy. The investigator was blinded to any sample and protein label when deriving the modules. We computed whole-module expression by taking the mean protein level across all proteins within each module for each participant.
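
A hedged sketch of the preprocessing leading up to module detection (residualizing the covariates and building the Spearman co-expression matrix; the additional PC1 removal and the SpeakEasy clustering itself are not reproduced, and all data layouts are assumptions):

```python
import pandas as pd
import statsmodels.formula.api as smf

def residual_coexpression(npx: pd.DataFrame, covars: pd.DataFrame) -> pd.DataFrame:
    """npx: participants x proteins NPX values; covars: DataFrame with 'age', 'sex'
    and 'mean_protein' columns sharing the same participant index (assumed names).
    Returns a proteins x proteins Spearman correlation matrix of standardized,
    covariate-adjusted residuals."""
    resid = pd.DataFrame(index=npx.index, columns=npx.columns, dtype=float)
    for prot in npx.columns:
        d = covars.assign(y=npx[prot])
        fit = smf.ols("y ~ age + sex + mean_protein", data=d).fit()
        resid[prot] = (fit.resid - fit.resid.mean()) / fit.resid.std()
    # Protein-protein co-expression based on Spearman's rank correlation
    return resid.corr(method="spearman")
```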

Immunofluorescent staining from postmortem brain tissue

Immunofluorescent staining from entorhinal cortex tissue from six patients with clinically diagnosed AD ( n  = 6, age 68–91 years, three women, Braak IV–VI, all with elevated amyloid plaques) and six nondemented controls ( n  = 6, age 72–92 years, five women, all Braak I or II, no or some amyloid plaques) from the Netherlands Brain Bank was performed for SMOC1, as well as markers for AD pathology (plaques and neurofibrillary tangles).

All donors provided informed consent for brain autopsies and the use of material and clinical data for research purposes, in compliance with national ethical guidelines. The samples were immersion-fixed in 4% paraformaldehyde on autopsy and left in phosphate-buffered saline (PBS) with 30% sucrose for 3 d. Thereafter the tissue was sectioned with a microtome (Leica SM 2010R) into 40-µm-thick sections and kept free floating in antifreeze solution at −20 °C.

For immunofluorescent staining against SMOC1 (Merck, cat. no. WH0064093M3, clone 8F10), p-tau 231 (Abcam, cat. no. ab151559) and methoxy-X04 (Tocris Biotechne, cat. no. 4920), sections of entorhinal cortex were incubated in citrate buffer 30% at 80 °C. After rinsing in potassium PBS (KPBS) the sections were incubated in blocking solution (5% normal goat serum in KPBS) at room temperature followed by an incubation in primary antibodies in blocking solution overnight at 4 °C. Afterwards, the tissue was washed with KPBS, 0.25% Triton X-100 and incubated in the appropriate secondary antibody (goat anti-rabbit 488, 1:200 (Invitrogen, cat. no. A11008) and goat anti-mouse 549, 1:200 (Invitrogen, cat. no. A11029)) in blocking solution for 24 h at 4 °C. After washing with KPBS, the tissue was stained with the ligand methoxy-X04 diluted 1:5,000 in PBS for 1 h at room temperature.

We compared the optical density (OD) area fraction between AD and nondemented controls. Analysis of the SMOC1-immunostained area fraction was performed by capturing six images of the entorhinal cortex in each donor with an Olympus BX41 light microscope with a ×40 objective (72 fields in total). The area fraction of SMOC1 in each image was analyzed using ImageJ v.1.54g and the values for each case were averaged and presented as the percentage OD area fraction.

Statistics and reproducibility

All analyses were performed in R v.4.2.1 or Python v.3.9.2. All plots were generated with the R package ggplot2 v.3.4.0. Brain renderings were generated with the Connectome Workbench software v.1.5.0. All regression analyses included age, sex and mean overall protein level as covariates. No statistical method was used to predetermine sample size; we used all participants who had proteomic and biomarker data available. Plots displaying levels of individual proteins or average from modules correspond to residual NPX values when regressing out the three covariates (age, sex and mean overall protein level). All statistical tests were two sided and P values adjusted for FDR.

Reporting summary

Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.

Data availability

Pseudonymized data from BioFINDER (Principal Investigator: O.H.) can be shared with qualified academic researchers after a request for the purpose of replicating procedures and results presented in the present study. In line with the EU General Data Protection Regulation legislation, a data transfer agreement must be established with Skåne University Hospital (Region Skåne) to share data. The agreement will include terms of how data are stored, protected and accessed, and define what the receiver can or cannot do. The proposed analyses must be compliant with decisions made by the Swedish Ethical Review Authority. This procedure is in place to ensure the anonymity of the participants, who have not consented to open sharing of their information, and to ensure that all data analyses are restricted to the ones agreed by the study participants and the Swedish Ethical Review Authority. ADNI data used in this manuscript are publicly available from the ADNI database ( adni.loni.usc.edu ) on registration and compliance with the data use agreement. SnRNA-seq data from ROSMAP are available at https://www.synapse.org/#!Synapse:syn52293433 on data use agreement. SnRNA-seq from the Allen Brain Institute is openly available at https://portal.brain-map.org/atlases-and-data/rnaseq . Summary statistics from all analyses are provided in the Supplementary Information .

Code availability

All code was written using available packages in R and Python and has been provided at https://github.com/alexapichet/NatureNeuro_2024_proteomics .

References

Braak, H. & Braak, E. Neuropathological stageing of Alzheimer-related changes. Acta Neuropathol. 82, 239–259 (1991).


Duyckaerts, C., Delatour, B. & Potier, M. C. Classification and basic pathology of Alzheimer disease. Acta Neuropathol. 118 , 5–36 (2009).

Jack, C. R. Jr. et al. Tracking pathophysiological processes in Alzheimer’s disease: an updated hypothetical model of dynamic biomarkers. Lancet Neurol. 12 , 207–216 (2013).


Bellenguez, C. et al. New insights into the genetic etiology of Alzheimer’s disease and related dementias. Nat. Genet. 54 , 412–436 (2022).

van der Kant, R., Goldstein, L. S. B. & Ossenkoppele, R. Amyloid-beta-independent regulators of tau pathology in Alzheimer disease. Nat. Rev. Neurosci. 21 , 21–35 (2020).


Del Campo, M. et al. CSF proteome profiling across the Alzheimer’s disease spectrum reflects the multifactorial nature of the disease and identifies specific biomarker panels. Nat. Aging 2 , 1040–1053 (2022).


Johnson, E. C. B. et al. Large-scale deep multi-layer analysis of Alzheimer’s disease brain reveals strong proteomic disease-related changes not observed at the RNA level. Nat. Neurosci. 25 , 213–225 (2022).

Higginbotham, L. et al. Integrated proteomics reveals brain-based cerebrospinal fluid biomarkers in asymptomatic and symptomatic Alzheimer’s disease. Sci. Adv. 6 , eaaz9360 (2020).

van der Ende, E. L. et al. CSF proteomics in autosomal dominant Alzheimer’s disease highlights parallels with sporadic disease. Brain 146 , 4495–4507 (2023).

Johnson, E. C. B. et al. Cerebrospinal fluid proteomics define the natural history of autosomal dominant Alzheimer’s disease. Nat. Med. 29 , 1979–1988 (2023).

Haque, R. et al. A protein panel in cerebrospinal fluid for diagnostic and predictive assessment of Alzheimer’s disease. Sci. Transl. Med. 15 , eadg4122 (2023).

Sung, Y. J. et al. Proteomics of brain, CSF, and plasma identifies molecular signatures for distinguishing sporadic and genetic Alzheimer’s disease. Sci. Transl. Med. 15 , eabq5923 (2023).

Salvado, G. et al. Specific associations between plasma biomarkers and postmortem amyloid plaque and tau tangle loads. EMBO Mol. Med. 15 , e17123 (2023).

Therriault, J. et al. Association of phosphorylated tau biomarkers with amyloid positron emission tomography vs tau positron emission tomography. JAMA Neurol. 80 , 188–199 (2023).

Smith, R. et al. Tau-PET is superior to phospho-tau when predicting cognitive decline in symptomatic AD patients. Alzheimers Dement. 19 , 2497–2507 (2023).

Ossenkoppele, R. et al. Amyloid and tau PET-positive cognitively unimpaired individuals are at high risk for future cognitive decline. Nat. Med. 28 , 2381–2387 (2022).

Mattsson-Carlgren, N. et al. Cerebrospinal fluid biomarkers in autopsy-confirmed Alzheimer disease and frontotemporal lobar degeneration. Neurology 98 , e1137–e1150 (2022).

Fleisher, A. S. et al. Positron emission tomography imaging with [ 18 F]flortaucipir and postmortem assessment of Alzheimer disease neuropathologic changes. JAMA Neurol. 77 , 829–839 (2020).

Smith, R., Wibom, M., Pawlik, D., Englund, E. & Hansson, O. Correlation of in vivo [ 18 F]flortaucipir with postmortem Alzheimer disease tau pathology. JAMA Neurol. 76 , 310–317 (2019).

Mathys, H. et al. Single-cell atlas reveals correlates of high cognitive function, dementia, and resilience to Alzheimer’s disease pathology. Cell 186 , 4365–4385.e4327 (2023).

Pereira, J. B. et al. DOPA decarboxylase is an emerging biomarker for Parkinsonian disorders including preclinical Lewy body disease. Nat. Aging 3 , 1201–1209 (2023).

Del Campo, M. et al. CSF proteome profiling reveals biomarkers to discriminate dementia with Lewy bodies from Alzheimer's disease. Nat. Commun. 14, 5635 (2023).

Gaiteri, C. et al. Robust, scalable, and informative clustering for diverse biological networks. Genome Biol. 24 , 228 (2023).

Olah, M. et al. Single cell RNA sequencing of human microglia uncovers a subset associated with Alzheimer’s disease. Nat. Commun. 11 , 6129 (2020).

Watson, C. M. et al. Quantitative mass spectrometry analysis of cerebrospinal fluid protein biomarkers in Alzheimer’s disease. Sci. Data 10 , 261 (2023).

Panyard, D. J. et al. Large-scale proteome and metabolome analysis of CSF implicates altered glucose and carbon metabolism and succinylcarnitine in Alzheimer’s disease. Alzheimers Dement . 19 , 5447–5470 (2023).

Sperling, R. A. et al. Toward defining the preclinical stages of Alzheimer’s disease: recommendations from the National Institute on Aging-Alzheimer’s Association workgroups on diagnostic guidelines for Alzheimer’s disease. Alzheimers Dement. 7 , 280–292 (2011).

Bai, B. et al. Deep multilayer brain proteomics identifies molecular networks in Alzheimer’s disease progression. Neuron 105 , 975–991.e977 (2020).

Drummond, E. et al. The amyloid plaque proteome in early onset Alzheimer’s disease and Down syndrome. Acta Neuropathol. Commun. 10 , 53 (2022).

Zhang, P. et al. Senolytic therapy alleviates Abeta-associated oligodendrocyte progenitor cell senescence and cognitive deficits in an Alzheimer’s disease model. Nat. Neurosci. 22 , 719–728 (2019).

Mila-Aloma, M. et al. Plasma p-tau231 and p-tau217 as state markers of amyloid-beta pathology in preclinical Alzheimer’s disease. Nat. Med. 28 , 1797–1801 (2022).


Barthelemy, N. R. et al. CSF tau phosphorylation occupancies at T217 and T205 represent improved biomarkers of amyloid and tau pathology in Alzheimer’s disease. Nat. Aging 3 , 391–401 (2023).

Busche, M. A. et al. Tau impairs neural circuits, dominating amyloid-beta effects, in Alzheimer models in vivo. Nat. Neurosci. 22 , 57–64 (2019).

Angulo, S. L. et al. Tau and amyloid-related pathologies in the entorhinal cortex have divergent effects in the hippocampal circuit. Neurobiol. Dis. 108 , 261–276 (2017).

Targa Dias Anastacio, H., Matosin, N. & Ooi, L. Neuronal hyperexcitability in Alzheimer’s disease: what are the drivers behind this aberrant phenotype? Transl. Psychiatry 12 , 257 (2022).

Tasaki, S. et al. Inferring protein expression changes from mRNA in Alzheimer’s dementia using deep neural networks. Nat. Commun. 13 , 655 (2022).

Desole, C. et al. HGF and MET: from brain development to neurological disorders. Front. Cell Dev. Biol. 9 , 683609 (2021).

PathCards: Pathway Unification Database. Hepatocyte Growth Factor Receptor Signaling. Genecards https://pathcards.genecards.org/Card/hepatocyte_growth_factor_receptor_signaling?queryString=HGF (Weizmann Institute of Science, 2023).

Modeste, E. S. et al. Quantitative proteomics of cerebrospinal fluid from African Americans and Caucasians reveals shared and divergent changes in Alzheimer’s disease. Mol. Neurodegen. 18 , 48 (2023).


Palmqvist, S. et al. Discriminative accuracy of plasma phospho-tau217 for Alzheimer disease vs other neurodegenerative disorders. JAMA 324 , 772–781 (2020).

Palmqvist, S. et al. Cognitive effects of Lewy body pathology in clinically unimpaired individuals. Nat. Med. 29 , 1971–1978 (2023).

American Psychiatric Association (ed.) Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition Text Revision (APA Publishing, 2022).

Jack, C. R. et al. NIA-AA research framework: toward a biological definition of Alzheimer’s disease. Alzheimer’s Dement. 14 , 535–562 (2018).


Wik, L. et al. Proximity extension assay in combination with next-generation sequencing for high-throughput proteome-wide analysis. Mol. Cell Proteom. 20 , 100168 (2021).

Olink. White paper—Data normalization and standardization (2022); https://olink.com/knowledge/documents

Sun, B. B., Chiou, J. & Traylor, M. et al. Plasma proteomic associations with genetics and health in the UK Biobank. Nature 622 , 329–338 (2023).

Karlsson, L., Vogel, J. & Arvidsson, I. et al. Cerebrospinal fluid reference proteins increase accuracy and interpretability of biomarkers for brain diseases. Nat. Commun. 15 , 3676 (2024).

Hansson, O. et al. CSF biomarkers of Alzheimer’s disease concord with amyloid-beta PET and predict clinical progression: a study of fully automated immunoassays in BioFINDER and ADNI cohorts. Alzheimers Dement . 14 , 1470–1481 (2018).

Gobom, J. et al. Validation of the LUMIPULSE automated immunoassay for the measurement of core AD biomarkers in cerebrospinal fluid. Clin. Chem. Lab. Med. 60 , 207–219 (2022).

Leuzy, A. et al. Diagnostic performance of RO948 F 18 tau positron emission tomography in the differentiation of Alzheimer disease from other neurodegenerative disorders. JAMA Neurol. 77, 955–965 (2020).

Jack, C. R. Jr. et al. Defining imaging biomarker cut points for brain aging and Alzheimer’s disease. Alzheimers Dement. 13 , 205–216 (2017).

Janelidze, S. et al. Plasma P-tau181 in Alzheimer’s disease: relationship to other biomarkers, differential diagnosis, neuropathology and longitudinal progression to Alzheimer’s dementia. Nat. Med. 26 , 379–386 (2020).

Hawrylycz, M. J. et al. An anatomically comprehensive atlas of the adult human brain transcriptome. Nature 489 , 391–399 (2012).

Markello, R., Shafiei, G., Zheng, Y.-Q. & Mišić, B. abagen: A toolbox for the Allen Brain Atlas genetics data v. 0.1.3. Zenodo https://zenodo.org/records/5129257 (2021).

Arnatkeviciute, A., Fulcher, B. D. & Fornito, A. A practical guide to linking brain-wide gene expression and neuroimaging data. NeuroImage 189 , 353–367 (2019).

Quackenbush, J. Microarray data normalization and transformation. Nat. Genet. 32 , 496–501 (2002).

Burt, J. B., Helmer, M., Shinn, M., Anticevic, A. & Murray, J. D. Generative modeling of brain maps with spatial autocorrelation. NeuroImage 220 , 117038 (2020).

Marques-Coelho, D. et al. Differential transcript usage unravels gene expression alterations in Alzheimer’s disease human brains. NPJ Aging Mech. Dis. 7 , 2 (2021).

Satija, R., Farrell, J. A., Gennert, D., Schier, A. F. & Regev, A. Spatial reconstruction of single-cell gene expression data. Nat. Biotechnol. 33 , 495–502 (2015).

Hao, Y. et al. Integrated analysis of multimodal single-cell data. Cell 184 , 3573–3587.e3529 (2021).

Skene, N. G. & Grant, S. G. Identification of vulnerable cell types in major brain disorders using single cell transcriptomes and expression weighted cell type enrichment. Front. Neurosci. 10 , 16 (2016).

Liao, Y., Wang, J., Jaehnig, E. J., Shi, Z. & Zhang, B. WebGestalt 2019: gene set analysis toolkit with revamped UIs and APIs. Nucleic Acids Res. 47 , W199–W205 (2019).

Wang, J., Vasaikar, S., Shi, Z., Greer, M. & Zhang, B. WebGestalt 2017: a more comprehensive, powerful, flexible and interactive gene set enrichment analysis toolkit. Nucleic Acids Res. 45 , W130–W137 (2017).

Cho, H. et al. In vivo cortical spreading pattern of tau and amyloid in the Alzheimer disease spectrum. Ann. Neurol. 80 , 247–258 (2016).

Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12 , 2825–2830 (2011).


Cannoodt, R. et al. SCORPIUS improves trajectory inference and identifies novel modules in dendritic cell development. Preprint at bioRxiv https://doi.org/10.1101/079509 (2016).

Saelens, W., Cannoodt, R., Todorov, H. & Saeys, Y. A comparison of single-cell trajectory inference methods. Nat. Biotechnol. 37 , 547–554 (2019).


Acknowledgements

We thank all the BioFINDER team members as well as participants in the study and their family members for their dedication. We thank the Netherlands Brain Bank, Netherlands Institute for Neuroscience, Amsterdam for supplying the brain tissue and neuropathological analyses. Acknowledgement is also made to the donors of the Alzheimer’s Disease Research, a program of the BrightFocus Foundation, for support of this research (grant no. A2021013F to A.P.B.). A.P.B. is supported by a postdoctoral fellowship from the Fonds de recherche en Santé Québec (grant no. 298314). The BioFINDER study was supported by the Swedish Research Council (grant nos. 2022-00775 to O.H. and 2021-02219 to N.M.-C.), ERA PerMed (grant no. ERAPERMED2021-184 to O.H.), the Knut and Alice Wallenberg foundation (grant no. 2017-0383 to O.H.), the Strategic Research Area MultiPark (Multidisciplinary Research in Parkinson’s disease) at Lund University, the Swedish Alzheimer Foundation (grant nos. AF-980907 to O.H. and AF-994229 to N.M.-C.), the Swedish Brain Foundation (grant nos. FO2021-0293 to O.H. and FO2023-0163 to N.M.-C.), the Parkinson Foundation of Sweden (grant no. 1412/22 to O.H.), the Cure Alzheimer’s fund, the Konung Gustaf V:s och Drottning Victorias Frimurarestiftelse, the Skåne University Hospital Foundation (grant no. 2020-O000028 to O.H.), Regionalt Forskningsstöd (grant no. 2022-1259 to O.H.), the EU Joint Programme Neurodegenerative Diseases (grant no. 2019-03401 to N.M.-C.), the WASP and DDLS Joint call for research projects (grant no. WASP/DDLS22-066 to N.M.-C.) and the Swedish federal government under the ALF agreement (grant nos. 2022-Projekt0080 to O.H. and 2022-Projekt0107 to N.M.-C.). J.W.V. is supported by the SciLifeLab & Wallenberg Data Driven Life Science Program (grant no. KAW 2020.0239). The precursor of [ 18 F]flutemetamol was sponsored by GE Healthcare. The precursor of [ 18 F]RO948 was provided by Roche. L.-H.T. received funding from the US National Institutes of Health (grant nos. RF1AG062377, RF1 AG05432 and RO1 AG054012). The funding sources had no role in the design and conduct of the study, the collection, analysis and interpretation of the data or the preparation, review or approval of the manuscript.

Open access funding provided by Lund University.

Author information

Authors and Affiliations

Clinical Memory Research Unit, Department of Clinical Sciences Malmö, Lund University, Lund, Sweden

Alexa Pichet Binette, Atul Kumar, Ines Hristovska, Nicola Spotorno, Gemma Salvadó, Olof Strandberg, Sebastian Palmqvist, Niklas Mattsson-Carlgren, Shorena Janelidze, Erik Stomrud & Oskar Hansson

Department of Psychiatry, SUNY Upstate Medical University, Syracuse, NY, USA

Chris Gaiteri

Rush University Alzheimer’s Disease Center, Rush University, Chicago, IL, USA

Chris Gaiteri & Erik Stomrud

Cognitive Disorder Research Unit, Department of Clinical Sciences Malmö, Lund University, Malmö, Sweden

Malin Wennström

Picower Institute for Learning and Memory, MIT, Cambridge, MA, USA

Hansruedi Mathys & Li-Huei Tsai

Department of Brain and Cognitive Sciences, MIT, Cambridge, MA, USA

University of Pittsburgh Brain Institute and Department of Neurobiology, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA

Hansruedi Mathys

Memory Clinic, Skåne University Hospital, Malmö, Sweden

Sebastian Palmqvist, Erik Stomrud & Oskar Hansson

Department of Neurology, Skåne University Hospital, Lund, Sweden

Niklas Mattsson-Carlgren

Wallenberg Center for Molecular Medicine, Lund University, Lund, Sweden

Department of Clinical Sciences Malmö, SciLifeLab, Lund University, Lund, Sweden

Jacob W. Vogel


Contributions

A.P.B., J.W.V. and O.H. designed the study and wrote the manuscript. A.P.B. had full access to raw data, carried out the final statistical analyses, did the QC and had the final responsibility of submitting for publication. S.P., E.S. and O.H. were responsible for clinical assessments and data collection. C.G. performed the co-expression module analysis and provided transcriptomics and bioinformatic expertise. M.W. performed immunofluorescence staining and analyses. H.M. and L.H.T. generated the ROSMAP single-nucleus data. O.S. processed the PET data. S.J. coordinated the CSF measurements. A.K., I.H., N.S., G.S. and N.M.-C. provided input on analyses and statistical models. All authors contributed to the interpretation of the results and critically reviewed the manuscript.

Corresponding authors

Correspondence to Alexa Pichet Binette or Oskar Hansson.

Ethics declarations

Competing interests.

O.H. has acquired research support (for the institution) from ADx, AVID Radiopharmaceuticals, Biogen, Eli Lilly, Eisai, Fujirebio, GE Healthcare, Pfizer and Roche. In the past 2 years, he has received consultancy/speaker fees from AC Immune, Amylyx, Alzpath, BioArctic, Biogen, Cerveau, Eisai, Eli Lilly, Fujirebio, Genentech, Merck, Novartis, Novo Nordisk, Roche, Sanofi and Siemens. S.P. has acquired research support (for the institution) from ki elements/ADDF. In the past 2 years, he has received consultancy/speaker fees from BioArctic, Biogen, Lilly and Roche. The remaining authors declare no competing interests.

Peer review

Peer review information.

Nature Neuroscience thanks Barbara Bendlin and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Correlations between regional gene expression and PET patterns.

Differentially abundant proteins for which the regional gene expression (Allen Human Brain data) was related to either the regional tau- ( a ) or Aβ-PET ( b ) SUVR deposition. Only proteins significant after accounting for spatial autocorrelation and across two brain atlases (Desikan-Killiany and Schaefer) are displayed. The only exception is MAPT (in gray), which was only significant in one atlas. CDH6 is in red as it is the only one with a negative correlation. Correlation coefficients and significance (two-sided test, no adjustment for multiple comparisons) are shown from the Desikan-Killiany atlas. See the Supplementary material for all detailed methods and results. * corresponds to p < 0.05, ** corresponds to p < 0.01.

Extended Data Fig. 2 Aβ and tau PET association in separate models.

a, b, Standardized beta coefficients from linear models relating AD fibrillar pathology at baseline (Aβ- and tau-PET SUVR included separately in different models) and over time (Aβ- and tau-PET rate of change included separately in different models) to the CSF protein levels of the 128 differentially expressed proteins. Models included age, sex and mean overall protein level as covariates. There were no significant associations with Aβ-PET rate of change, and thus those results are not displayed. Results in panel a included all CU and MCI participants. Results in panel b were restricted to Aβ-positive CU and MCI participants. All linear regressions performed were two-sided, and p-values were adjusted for false discovery rate. * corresponds to pFDR < 0.05, ** corresponds to pFDR < 0.01, *** corresponds to pFDR < 0.001. Aβ = beta-amyloid; CU = cognitively unimpaired; MCI = mild cognitive impairment.

Extended Data Fig. 3 Average gene expression levels across cell types in the Allen Brain dataset.

Proportion of gene expression by cell type from single-cell transcriptomics data from the middle temporal gyrus, for the differentially expressed proteins that were not included in Fig. 3b. To improve legibility, only average expression values above 5% are displayed. Transcriptomics data were downloaded from the reference donors of the Allen Brain Institute.

Extended Data Fig. 4 Average gene expression levels across cell types in the ROSMAP single nuclei data.

Proportion of gene expression by cell type from single-nucleus data from the dorsolateral prefrontal gyrus of 427 ROSMAP donors. The 128 DAPs are depicted on the y-axis. Average expression was calculated across the whole sample (left column for each cell type, 'Whole'), only A− donors (middle column for each cell type, 'Neg') and only A+ donors (right column for each cell type, 'Pos'). The A− or A+ status was defined based on the CERAD score (A− being No AD or Possible; A+ being Definite or Probable). Transcriptomics data were accessed from Mathys et al, Cell, 2023.

Extended Data Fig. 5 Cell-type enrichment analyses.

a, b, Cell-type enrichment analyses based on single-cell transcriptomics data from the middle temporal gyrus, using the different categories of differentially expressed proteins (a) and taking all proteins that are part of the different biological modules (b). Bootstrap enrichment tests were performed, and all p-values were adjusted for false discovery rate. * corresponds to pFDR < 0.05, ** corresponds to pFDR < 0.01.

Extended Data Fig. 6 SMOC1 characterization in postmortem human tissue.

a, Differential gene expression for SMOC1 (the protein most strongly associated with Aβ) across all cell types and with five measures of AD pathology, accessed from Mathys et al, Cell, 2023, based on single-nucleus data from 427 ROSMAP donors. Only associations (two-sided regressions) surviving correction for false discovery rate across all cell types are depicted. Amyloid plaque and tau tangle loads were quantified with immunohistochemistry (IHC) (AT8 antibody for tau) and averaged across 8 brain regions. Diffuse plaques, neuritic plaques and neurofibrillary tangles were assessed with silver-stained slides in 5 brain regions. Overall, there were higher SMOC1 levels with greater postmortem AD pathology, predominantly in OPCs and to some extent in astrocytes and some inhibitory neuronal subtypes, reflecting the cell types where this gene is mainly expressed. b, c, Representative images of entorhinal cortex from a nondemented control (NC) (b) and an Alzheimer's disease (AD) patient (c), stained against SMOC1 (red) and the nuclear marker DAPI (blue). d, Optical density (OD) area fraction of SMOC1 in entorhinal cortex in six NC and six AD patients. Each point represents the mean of the OD area fraction in six images captured from each case. The data were analyzed using a two-sided Student's t-test. Significant difference at ***p < 0.001. e–g, Example of double staining against SMOC1 (red) and methoxy-X04-stained plaques (blue). e, SMOC1 staining. f, Aβ plaque staining. g, Merged image of SMOC1 and plaques. SMOC1 was found in some (long arrow), but not all (short arrow), methoxy-X04-stained plaques, which were found in three out of six AD cases. Overall, we found higher SMOC1 levels in brain tissue from AD cases compared with controls, with SMOC1 localized in amyloid plaques. Scale bar in (b, c) = 10 μm and in (g) = 50 μm.

Extended Data Fig. 7 Trajectories of established AD-related proteins.

a, Box plots of four key AD-related proteins that are among the proteins differentially expressed between the different AT categories (A−T−: n = 352; A+T−: n = 184; A+T+: n = 231) and non-AD neurodegenerative diseases (n = 110). Values correspond to residual values after regressing out age, sex and mean overall protein level. In all box plots, the box limits represent the first and third quartiles, and the whisker extends to 1.5 times the interquartile range. The red dot represents the mean and the red line extends ±1 standard deviation. b, Proteins shown in panel a plotted against the pseudotime using generalized additive models. Error bands around the data correspond to the 95% confidence interval.

Extended Data Fig. 8 Microglia states across the different protein modules.

a, Correspondence of different microglial states from human single-cell data from Olah et al, Nature Communications, 2020 (stacked bar plots) across the different BioFINDER-2 proteomic modules (x-axis). For simplicity, all homeostatic states have been grouped together, and clusters 5, 6 and 7 have also been grouped together owing to their high similarity, as described in Olah et al. b, c, The predominance of microglia-related proteins in Module 2 was also confirmed by comparison with two other sets of microglia genes derived from aged human brains: Patrick et al, Translational Psychiatry, 2021 (b) and Olah et al, Nature Communications, 2018 (c). Number of microglial proteins identified from the two previous publications (y-axis) across the different Olink modules (x-axis). The number of proteins overlapping between the Olink data and each of the three microglia gene sets is reported in the titles.

Supplementary information

Supplementary information.

Supplementary Methods, Tables 1–8, Results and Figs. 1–5.

Reporting Summary

Supplementary Tables 1–8

Supplementary Table 1: Summary statistics of differential protein abundance across the different A/T categories.
Supplementary Table 2: Summary statistics of differential protein abundance between A− and A+ participants in BioFINDER-1 (validation cohort).
Supplementary Table 3: Summary statistics of differential protein abundance between A− and A+ participants in ADNI (validation cohort).
Supplementary Table 4: Summary statistics of imaging transcriptomics using BrainSmash.
Supplementary Table 5: Summary statistics of associations between DAPs and Aβ- and tau-PET.
Supplementary Table 6: GO significant terms from enrichment analyses of early, core and late proteins.
Supplementary Table 7: GO significant terms from enrichment analyses of the downregulated proteins in A+T+ and non-AD.
Supplementary Table 8: GO significant terms from enrichment analyses in the co-expression modules.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Pichet Binette, A., Gaiteri, C., Wennström, M. et al. Proteomic changes in Alzheimer disease associated with progressive Aβ plaque and tau tangle pathologies. Nat Neurosci (2024). https://doi.org/10.1038/s41593-024-01737-w

Download citation

Received : 26 August 2023

Accepted : 23 July 2024

Published : 26 August 2024

DOI : https://doi.org/10.1038/s41593-024-01737-w


Can we have assignment in a condition?

Is it possible to have assignment in a condition?
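For context, the question is about the C idiom of assigning inside a condition. A literal Python translation fails, and before Python 3.8 the usual workaround was to assign first and then test; in the sketch below, next_line and process are hypothetical stand-ins, not names from the question:

    def next_line(_lines=iter(["first", "second", ""])):
        # hypothetical stand-in for a data source that eventually returns ""
        return next(_lines)

    def process(line):
        print("processing", line)

    # C-style idiom being asked about (not valid Python):
    #     while ((line = next_line()) != NULL) { ... }
    #
    # A literal translation is rejected, because = is a statement, not an expression:
    #     while line = next_line():    # SyntaxError
    #         process(line)

    # Pre-3.8 workaround: assign, test, then re-assign inside the loop.
    line = next_line()
    while line:
        process(line)
        line = next_line()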


  • 3 This is the first hit on google and the top answer is incorrect. Yes you can as of python.org/dev/peps/pep-0572 if a := somefunc(): #use a –  Rqomey Commented Sep 7, 2021 at 14:09

10 Answers

Why not try it out?

Update: This is possible (with different syntax) in Python 3.8
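The snippets from this answer were lost in extraction; a minimal sketch of what "trying it out" shows, and of the Python 3.8 syntax, follows (somefunc is a hypothetical stand-in for the asker's function):

    def somefunc():
        return 42  # hypothetical function from the question

    # Before Python 3.8, assignment inside the condition is a syntax error:
    #     if a = somefunc():    # SyntaxError: invalid syntax
    #         print(a)

    # From Python 3.8, the walrus operator := allows it:
    if a := somefunc():
        print("somefunc() returned a truthy value:", a)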


  • 46 this is intentionally forbidden as Guido, benevolent python dictator, finds them unnecessary and more confusing than useful. It's the same reason there's no post-increment or pre-increment operators (++). –  Matt Boehm Commented Apr 8, 2010 at 22:53
  • 6 he did allow the addition of augmented assignment in 2.0 because x = x + 1 requires additional lookup time while x += 1 was somewhat faster, but I'm sure he didn't even like doing that much. :-) –  wescpy Commented Apr 8, 2010 at 23:56
  • 1 hope, this will become possible soon. Python is just slow to evolve –  Nik O'Lai Commented Aug 28, 2020 at 9:07
  • 55 "why not try it out" - Because who knows what the syntax might be? Maybe OP tried that and it didn't work, but that doesn't mean the syntax isn't different, or that there's not a way to do it that's not intended –  Levi H Commented Sep 22, 2020 at 17:09
  • 12 Fun fact: := (the assignment expression) is also known as the "walrus operator" because if considered as a text based emoji it looks like a walrus. :) –  jlsecrest Commented Nov 6, 2021 at 4:13

UPDATE - Original answer is near the bottom

Python 3.8 will bring in PEP572

Abstract: This is a proposal for creating a way to assign to variables within an expression using the notation NAME := expr. A new exception, TargetScopeError, is added, and there is one change to evaluation order.

https://lwn.net/Articles/757713/

The "PEP 572 mess" was the topic of a 2018 Python Language Summit session led by benevolent dictator for life (BDFL) Guido van Rossum. PEP 572 seeks to add assignment expressions (or "inline assignments") to the language, but it has seen a prolonged discussion over multiple huge threads on the python-dev mailing list—even after multiple rounds on python-ideas. Those threads were often contentious and were clearly voluminous to the point where many probably just tuned them out. At the summit, Van Rossum gave an overview of the feature proposal, which he seems inclined toward accepting, but he also wanted to discuss how to avoid this kind of thread explosion in the future.

https://www.python.org/dev/peps/pep-0572/#examples-from-the-python-standard-library

Examples from the Python standard library

site.py: env_base is only used on these lines, putting its assignment on the if moves it as the "header" of the block.

Current:

    env_base = os.environ.get("PYTHONUSERBASE", None)
    if env_base:
        return env_base

Improved:

    if env_base := os.environ.get("PYTHONUSERBASE", None):
        return env_base

_pydecimal.py: Avoid nested if and remove one indentation level.

Current:

    if self._is_special:
        ans = self._check_nans(context=context)
        if ans:
            return ans

Improved:

    if self._is_special and (ans := self._check_nans(context=context)):
        return ans

copy.py: Code looks more regular and avoids multiple nested if. (See Appendix A for the origin of this example.)

Current:

    reductor = dispatch_table.get(cls)
    if reductor:
        rv = reductor(x)
    else:
        reductor = getattr(x, "__reduce_ex__", None)
        if reductor:
            rv = reductor(4)
        else:
            reductor = getattr(x, "__reduce__", None)
            if reductor:
                rv = reductor()
            else:
                raise Error(
                    "un(deep)copyable object of type %s" % cls)

Improved:

    if reductor := dispatch_table.get(cls):
        rv = reductor(x)
    elif reductor := getattr(x, "__reduce_ex__", None):
        rv = reductor(4)
    elif reductor := getattr(x, "__reduce__", None):
        rv = reductor()
    else:
        raise Error("un(deep)copyable object of type %s" % cls)

datetime.py: tz is only used for s += tz, moving its assignment inside the if helps to show its scope.

Current:

    s = _format_time(self._hour, self._minute, self._second, self._microsecond, timespec)
    tz = self._tzstr()
    if tz:
        s += tz
    return s

Improved:

    s = _format_time(self._hour, self._minute, self._second, self._microsecond, timespec)
    if tz := self._tzstr():
        s += tz
    return s

sysconfig.py: Calling fp.readline() in the while condition and calling .match() on the if lines make the code more compact without making it harder to understand.

Current:

    while True:
        line = fp.readline()
        if not line:
            break
        m = define_rx.match(line)
        if m:
            n, v = m.group(1, 2)
            try:
                v = int(v)
            except ValueError:
                pass
            vars[n] = v
        else:
            m = undef_rx.match(line)
            if m:
                vars[m.group(1)] = 0

Improved:

    while line := fp.readline():
        if m := define_rx.match(line):
            n, v = m.group(1, 2)
            try:
                v = int(v)
            except ValueError:
                pass
            vars[n] = v
        elif m := undef_rx.match(line):
            vars[m.group(1)] = 0

Simplifying list comprehensions: A list comprehension can map and filter efficiently by capturing the condition:

    results = [(x, y, x/y) for x in input_data if (y := f(x)) > 0]

Similarly, a subexpression can be reused within the main expression, by giving it a name on first use:

    stuff = [[y := f(x), x/y] for x in range(5)]

Note that in both cases the variable y is bound in the containing scope (i.e. at the same level as results or stuff).

Capturing condition values: Assignment expressions can be used to good effect in the header of an if or while statement:

    # Loop-and-a-half
    while (command := input("> ")) != "quit":
        print("You entered:", command)

    # Capturing regular expression match objects
    # See, for instance, Lib/pydoc.py, which uses a multiline spelling
    # of this effect
    if match := re.search(pat, text):
        print("Found:", match.group(0))
    # The same syntax chains nicely into 'elif' statements, unlike the
    # equivalent using assignment statements.
    elif match := re.search(otherpat, text):
        print("Alternate found:", match.group(0))
    elif match := re.search(third, text):
        print("Fallback found:", match.group(0))

    # Reading socket data until an empty string is returned
    while data := sock.recv(8192):
        print("Received data:", data)

Particularly with the while loop, this can remove the need to have an infinite loop, an assignment, and a condition. It also creates a smooth parallel between a loop which simply uses a function call as its condition, and one which uses that as its condition but also uses the actual value.

Fork: An example from the low-level UNIX world:

    if pid := os.fork():
        # Parent code
    else:
        # Child code

Original answer

http://docs.python.org/tutorial/datastructures.html

Note that in Python, unlike C, assignment cannot occur inside expressions. C programmers may grumble about this, but it avoids a common class of problems encountered in C programs: typing = in an expression when == was intended.

http://effbot.org/pyfaq/why-can-t-i-use-an-assignment-in-an-expression.htm


  • 1 I like this answer because it actually points out why such a "feature" might have been deliberately left out of Python. When teaching beginner's programming, I have seen many make this mistake if (foo = 'bar') while intending to test the value of foo . –  Jonathan Cross Commented Feb 19, 2019 at 23:25
  • 3 @JonathanCross, This "feature" is actually going to be added in 3.8. It is unlikely to be used as sparingly as it should - but at least it is not a plain = –  John La Rooy Commented Feb 19, 2019 at 23:52
  • @JohnLaRooy: Looking at the examples, I think "unlikely to be used as sparingly as it should" was spot-on; Out of the ~10 examples, I find that only two actually improve the code. (Namely, as the sole expression either in a while condition, to avoid duplicating the line or having the loop condition in the body, or in an elif-chain to avoid nesting) –  Aleksi Torhamo Commented Apr 10, 2020 at 11:07
  • Doesn't work with ternary a if a := 1 == 1 else False → Invalid syntax :c So apparently still have to resort to the old awkward (lambda: (ret := foo(), ret if ret else None))()[-1] (i.e. as a way to avoid the repeated call to foo() in ternary) . –  Hi-Angel Commented Nov 25, 2020 at 21:20
  • 3 @Hi-Angel This actually works fine (Python 3.10.8). You just have to add parentheses according to what you actually want to express, i.e. either a if (a := 1) == 1 else False (yields 1 ) or a if (a := 1 == 1) else False (yields True ). –  user686249 Commented Nov 29, 2022 at 15:54

Nope, the BDFL didn't like that feature.

From where I sit, Guido van Rossum, "Benevolent Dictator For Life", has fought hard to keep Python as simple as it can be. We can quibble with some of the decisions he's made -- I'd have preferred he said 'No' more often. But the fact that there hasn't been a committee designing Python, but instead a trusted "advisory board", based largely on merit, filtering through one designer's sensibilities, has produced one hell of a nice language, IMHO.


  • 20 Simple? This feature could have simplified quite a bit of my code because it could have made it more compact and therefore more readable. Now I need two lines where I used to need one. I never got the point of why Python rejects features other programming languages have had for many years (and often for a very good reason). Especially this feature we're talking about here is very, very useful. –  Regis May Commented Sep 10, 2017 at 8:09
  • 7 Less code isn't always simpler or more readable. Take a recursive function for example. Its loop equivalent is often more readable. –  F.M.F. Commented Apr 9, 2018 at 8:50
  • 1 I don't like like the C version of it, but I really miss having something like rust's if let when I have an if elif chain, but need to store and use the value of the condition in each case. –  Thayne Commented Jul 10, 2018 at 4:57
  • 1 I have to say the code I am writing now (the reason I searched this issue) is MUCH uglier without this feature. Instead of using if followed by lots of else ifs, I need to keep indenting the next if under the last else. –  MikeKulls Commented Oct 24, 2018 at 3:34

Yes, but only from Python 3.8 and onwards.

PEP 572 proposes Assignment Expressions and has already been accepted.

The Syntax and semantics part of the PEP describes the new notation: in any expression, NAME := expr binds the value of expr to NAME, and the assignment expression itself evaluates to that value.

In your specific case, you will be able to write
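(The original snippet is not preserved here; based on the question and the comment above, it would read roughly as follows, with somefunc standing in for the asker's function.)

    def somefunc():
        return "value"   # hypothetical stand-in

    if a := somefunc():
        print(a)         # use a here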


Not directly, per this old recipe of mine -- but as the recipe says it's easy to build the semantic equivalent, e.g. if you need to transliterate directly from a C-coded reference algorithm (before refactoring to more-idiomatic Python, of course;-). I.e.:
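The recipe's code is not reproduced in this copy; the semantic equivalent it describes can be sketched as a small holder object (the names here are illustrative, not necessarily the recipe's own):

    class DataHolder:
        """Wrap a value so it can be set and read back inside a condition."""
        def __init__(self, value=None):
            self.value = value
        def set(self, value):
            self.value = value
            return value
        def get(self):
            return self.value

    def somefunc(_state=[3, 2, 1, 0]):
        return _state.pop(0)   # hypothetical: returns 3, 2, 1, then a falsish 0

    data = DataHolder()
    while data.set(somefunc()):        # the "assignment" happens inside the condition
        print("got", data.get())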

BTW, a very idiomatic Pythonic form for your specific case, if you know exactly what falsish value somefunc may return when it does return a falsish value (e.g. 0 ), is
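The idiom alluded to is presumably the two-argument form of iter(), which keeps calling the function until it returns the given sentinel (here 0); a small self-contained sketch:

    def somefunc(_state=[3, 2, 1, 0]):
        return _state.pop(0)   # hypothetical: eventually returns the sentinel 0

    for x in iter(somefunc, 0):    # call somefunc() repeatedly until it returns 0
        print("processing", x)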

so in this specific case the refactoring would be pretty easy;-).

If the return could be any kind of falsish value (0, None , '' , ...), one possibility is:
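One possibility along those lines (a sketch, not necessarily the answer's original snippet) combines itertools.takewhile with a sentinel that can never be returned:

    import itertools

    def somefunc(_state=[3, "x", None]):
        return _state.pop(0)   # hypothetical: the falsish value could be None, 0, '', ...

    # iter() with an unreachable sentinel keeps yielding values;
    # takewhile(bool, ...) stops at the first falsish one.
    for x in itertools.takewhile(bool, iter(somefunc, object())):
        print("processing", x)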

but you might prefer a simple custom generator:
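And the custom-generator variant could be sketched roughly like this:

    def getwhile(func, *args, **kwargs):
        """Yield func(*args, **kwargs) until it returns a falsish value."""
        while True:
            x = func(*args, **kwargs)
            if not x:
                break
            yield x

    def somefunc(_state=[3, 2, 1, None]):
        return _state.pop(0)   # hypothetical data source

    for x in getwhile(somefunc):
        print("processing", x)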


  • I would vote this up twice if I could. This is a great solution for those times when something like this is really needed. I adapted your solution to a regex Matcher class, which is instantiated once and then .check() is used in the if statement and .result() used inside its body to retrieve the match, if there was one. Thanks! :) –  Teekin Commented Jul 26, 2018 at 15:23

Thanks to a new Python 3.8 feature, this is possible from that version onwards, although not using = but the Ada-like assignment operator :=. Example from the docs:
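The example given in the Python 3.8 release notes is, reproduced from memory (so treat it as illustrative rather than a verbatim quote):

    a = list(range(15))   # sample data so the snippet runs

    if (n := len(a)) > 10:
        print(f"List is too long ({n} elements, expected <= 10)")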


No. Assignment in Python is a statement, not an expression.


  • And Guido wouldn't have it any other way. –  Mark Ransom Commented Apr 8, 2010 at 22:51
  • 1 @MarkRansom All hail Guido. Right .. sigh. –  WestCoastProjects Commented Dec 11, 2017 at 4:49
  • @javadba the guy has been right much more often than he's been wrong. I appreciate that having a single person in charge of the vision results in a much more coherent strategy than design by committee; I can compare and contrast with C++ which is my main bread and butter. –  Mark Ransom Commented Dec 11, 2017 at 4:58
  • I feel both ruby and scala ( v different languages) get it right significantly moreso than python: but in any case here is not the place.. –  WestCoastProjects Commented Dec 11, 2017 at 6:11
  • @MarkRansom "And Guido wouldn't have it any other way" - and yet, 10 years later we have Assignment Expressions ... –  Tomerikoo Commented Apr 10, 2021 at 14:48

You can define a function to do the assigning for you:
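The answer's snippet is not preserved here; one way to implement the idea (with illustrative names) is a helper that stashes the value on itself so it can be read back inside the block:

    def assign(value):
        """Store the value on the function object and return it."""
        assign.value = value
        return value

    def somefunc():
        return 42   # hypothetical function being tested

    if assign(somefunc()):
        print("got", assign.value)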


One of the reasons why assignments are illegal in conditions is that it's easier to make a mistake and assign True or False:
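The answer's example is missing from this copy; the kind of accident it refers to only existed in Python 2, where True and False were ordinary names that an assignment could silently rebind (shown as comments because it is not valid Python 3):

    # Python 2 session (illustrative):
    #     >>> True = 2 + 2 == 5    # meant to compare, accidentally rebinds the name True
    #     >>> True
    #     False
    #
    # Python 3 instead raises: SyntaxError: cannot assign to True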

In Python 3 True and False are keywords, so no risk anymore.


  • 1 In [161]: l_empty==[] Out[161]: True In [162]: []==[] Out[162]: True I do not think that is the reason –  volcano Commented Jan 2, 2014 at 13:23
  • Pretty sure most people put == True on the right side anyway. –  numbermaniac Commented Mar 11, 2019 at 7:27

The assignment expression operator - also known informally as the walrus operator - was created on 28-Feb-2018 in PEP 572.

For the sake of completeness, I'll post the relevant parts so you can compare the differences between 3.7 and 3.8:
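The quoted 3.7-versus-3.8 comparison is not preserved here; a representative before/after pair (with a hypothetical data source) looks like this:

    def make_reader():
        chunks = iter([b"ab", b"cd", b""])   # hypothetical source ending with an empty chunk
        return lambda: next(chunks)

    # Python 3.7 and earlier: assign, test, then re-assign inside the loop.
    read_chunk = make_reader()
    chunk = read_chunk()
    while chunk:
        print("processing", chunk)
        chunk = read_chunk()

    # Python 3.8+: the walrus operator folds the assignment into the condition.
    read_chunk = make_reader()
    while chunk := read_chunk():
        print("processing", chunk)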

