
      Fast Random Projection

      ✓ File Writeback ✕ Property Writeback ✓ Direct Return ✓ Stream Return ✕ Stats

      Overview

      Fast Random Projection (or FastRP) is a scalable and performant algorithm for learning node representations in a graph. It computes node embeddings iteratively using two components: first, node similarity matrix construction; second, dimension reduction by random projection.

      FastRP was proposed by H. Chen et al. in 2019 in the paper Fast and Accurate Network Embeddings via Very Sparse Random Projection.

      FastRP Framework

      The authors of FastRP propose that most network embedding methods essentially consist of two components: (1) construct a node similarity matrix or implicitly sample node pairs from it, and then (2) apply dimension reduction techniques to this matrix.

      Regarding dimension reduction, many popular algorithms, such as Node2Vec, rely on time-consuming methods like Skip-gram. The authors argue that the success of these methods should be attributed to the proper construction of the similarity matrix rather than to the dimension reduction technique employed.

      Therefore, FastRP utilizes very sparse random projection, which is a scalable optimization-free method for dimension reduction.

      Very Sparse Random Projection

      Random projection is a dimension reduction method that roughly preserves the relative pairwise distances between data points when embedding them from the high-dimensional space into a lower-dimensional space. The theoretical guarantees of random projection are mainly derived from the Johnson-Lindenstrauss Lemma.

      The idea behind random projection is simple: to reduce a matrix M of size n×m (for graph data, n = m) to a matrix N of size n×d, where d ≪ n, we simply multiply M by a random projection matrix R of size m×d: N = M ⋅ R.

      The matrix R is generated by sampling its entries independently from a random distribution. Specifically, FastRP adopts the very sparse random projection method for dimension reduction, where each entry of R is sampled from the following distribution:

      Rij = +√s with probability 1/(2s),  0 with probability 1 − 1/s,  −√s with probability 1/(2s)

      where s = √m.

      Very sparse random projection has superior computational efficiency: it requires only matrix multiplication, and a (1 − 1/s) fraction of the entries of R are zero.
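The sampling rule above can be sketched in a few lines of NumPy (illustrative code, not Ultipa's implementation; all names are made up):

```python
import numpy as np

def sparse_projection_matrix(m, d, seed=0):
    """Very sparse random projection matrix: each entry is +sqrt(s), 0,
    or -sqrt(s) with probabilities 1/(2s), 1-1/s, 1/(2s), s = sqrt(m)."""
    rng = np.random.default_rng(seed)
    s = np.sqrt(m)
    signs = rng.choice([1.0, 0.0, -1.0], size=(m, d),
                       p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])
    return np.sqrt(s) * signs

# Reduce 100 points from m=400 to d=16 dimensions: N = M . R
M = np.random.default_rng(1).normal(size=(100, 400))
R = sparse_projection_matrix(400, 16)
N = M @ R
print(N.shape)          # (100, 16)
print((R == 0).mean())  # roughly 1 - 1/s = 0.95 of the entries are zero
```

Since only about a 1/s fraction of the entries of R are nonzero, the product M ⋅ R can be carried out with sparse matrix multiplication in practice.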

      Additionally, Ultipa supports initializing the matrix R with a combination of selected numeric node properties (features). In this case, each row of R is formed by concatenating two parts:

      • Elements d1 ~ dx are generated by the very sparse random projection method.
      • Elements p1 ~ py are linear combinations of the selected node property values.
      • x + y equals the dimension of the final embeddings, where y is referred to as the property dimension.
      • The number of selected node features does not have to equal the property dimension.
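A toy sketch of this row construction (hypothetical names and shapes; the random mixing weights are an assumption, and Ultipa's actual combination scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, y = 5, 8, 3             # 5 nodes, final dimension d=8, property dimension y=3
x = d - y                     # d1 ~ dx: random part of each row
features = rng.random((n, 2))            # 2 selected numeric node properties

rand_part = rng.standard_normal((n, x))  # stand-in for the very sparse sampling
mix = rng.standard_normal((2, y))        # weights for the linear combinations
prop_part = features @ mix               # p1 ~ py: y combinations of the features
R = np.hstack([rand_part, prop_part])    # each row: d1..dx | p1..py
print(R.shape)   # (5, 8) -- 2 features still yield y=3 property dimensions
```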

      Node Similarity Matrix Construction

      FastRP incorporates two key ideas when constructing the node similarity matrix:

      1. Preserve high-order proximity in the graph
      2. Perform element-wise normalization on the matrix

      Let S be the adjacency matrix of graph G = (V,E), D be the degree matrix, and A = D⁻¹S be the transition matrix. See the example below, where the number on each edge represents the edge weight:

      In matrix A, the entry Aij is the probability that a 1-step random walk starting at node i ends at node j. Furthermore, raising A to the k-th power yields the matrix A^k, whose entry (A^k)ij is the probability of reaching node j from node i in k steps of random walk. Hence, the matrix A^k preserves high-order proximity of the graph.
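These definitions can be checked on a small weighted graph (an illustrative sketch, not Ultipa's code):

```python
import numpy as np

# Weighted adjacency matrix S of a 3-node undirected graph
S = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 0.0],
              [1.0, 0.0, 0.0]])
D = np.diag(S.sum(axis=1))        # degree matrix
A = np.linalg.inv(D) @ S          # transition matrix A = D^-1 S; rows sum to 1
A2 = A @ A                        # (A^2)_ij: 2-step random-walk probabilities

print(A[0])    # 1-step probabilities from node 0: [0, 2/3, 1/3]
print(A2[0])   # 2-step probabilities from node 0: [1, 0, 0]
```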

      However, as k → ∞, it can be shown that (A^k)ij → dj/2m, where m = |E| and dj is the degree of node j. Since many real-world graphs are scale-free, the entries of A^k follow a heavy-tailed distribution. As a result, the pairwise distances between rows of A^k are dominated by the columns with exceptionally large values, making them less meaningful and posing challenges for the subsequent dimension reduction step.

      To address this, FastRP further normalizes A^k by multiplying it with a diagonal normalization matrix L whose diagonal entries are Ljj = (dj/2m)^β, where β is the normalization strength. The normalization strength controls the impact of node degree on the embeddings: when β is negative, the importance of high-degree neighbors is weakened; when β is positive, it is strengthened.

      Algorithm Process

      The FastRP algorithm follows these steps:

      1. Generate the random projection matrix R.
      2. Reduce the dimension of the normalized 1-step transition matrix: N1 = A ⋅ L ⋅ R
      3. Reduce the dimension of the normalized 2-step transition matrix: N2 = A ⋅ (A ⋅ L ⋅ R) = A ⋅ N1
      4. Repeat step 3 up to the normalized k-step transition matrix: Nk = A ⋅ Nk-1.
      5. Form the weighted combination of the different powers of A: N = α1N1 + ... + αkNk, where α1, ..., αk are the iteration weights.

      The efficiency of FastRP also benefits from the associativity of matrix multiplication: A^k is never computed from scratch, since each Nk is obtained iteratively from Nk-1.
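The five steps can be sketched end to end as follows (a minimal illustration assuming an undirected, unweighted graph; not Ultipa's implementation):

```python
import numpy as np

def fast_rp(S, d, beta=-0.5, weights=(0.0, 1.0, 1.0), seed=0):
    """Steps 1-5: sample R, then iterate N_k = A . N_{k-1} and
    accumulate the weighted combination of the iterations."""
    n = S.shape[0]
    deg = S.sum(axis=1)
    m = S.sum() / 2                          # |E| of the undirected graph
    A = S / deg[:, None]                     # transition matrix D^-1 S
    L = np.diag((deg / (2 * m)) ** beta)     # normalization matrix

    rng = np.random.default_rng(seed)
    s = np.sqrt(n)                           # graph data: n input dimensions
    R = np.sqrt(s) * rng.choice([1.0, 0.0, -1.0], size=(n, d),
                                p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])

    Nk = L @ R                               # so the first product is A . L . R
    emb = np.zeros((n, d))
    for alpha in weights:                    # iteration weights alpha_1..alpha_k
        Nk = A @ Nk                          # N_k = A . N_{k-1}, never A^k itself
        emb += alpha * Nk
    return emb

S = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle graph
print(fast_rp(S, d=4).shape)   # (3, 4): one 4-dim embedding per node
```

Note how the loop realizes the associativity argument above: each iteration multiplies the previous result by A once, so A^k is never materialized.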

      Considerations

      • The FastRP algorithm ignores the direction of edges and treats all edges as undirected.

      Syntax

      • Command: algo(fastRP)
      • Parameters:
      Name | Type | Spec | Default | Optional | Description
      dimension | int | ≥2 | / | No | Dimensionality of the embeddings
      normalizationStrength | float | / | / | Yes | Normalization strength β
      iterationWeights | []float | >0 | [0.0,1.0,1.0] | Yes | Weight for each iteration; the size of the array is the number of iterations
      edge_schema_property | []@<schema>?.<property> | Numeric type, must LTE | / | Yes | Edge property(-ies) to be used as edge weight(s), where the values of multiple properties are summed up
      node_schema_property | []@<schema>?.<property> | Numeric type, must LTE | / | Yes | Node property(-ies) to be used as feature(s); propertyDimension and node_schema_property should be configured together or both ignored
      propertyDimension | int | (0,dimension] | / | Yes | Length of the property dimension; propertyDimension and node_schema_property should be configured together or both ignored
      limit | int | ≥-1 | -1 | Yes | Number of results to return; -1 returns all results

      Example

      File Writeback

      Spec | Content
      filename | _id, embedding_result
      algo(fastRP).params({
        dimension : 10,
        normalizationStrength: 1.5,
        iterationWeights: [0, 0.8, 0.2, 1],
        edge_schema_property: ['w1', 'w2'],
        node_schema_property: ['@default.f1', '@default.f2'],
        propertyDimension: 3
      }).write({
        file:{
          filename: 'embedding'
        }
      })
      

      Property Writeback

      Spec | Content | Write to | Data Type
      property | embedding_result | Node Property | string
      algo(fastRP).params({
        dimension : 10,
        normalizationStrength: 1.5,
        iterationWeights: [0, 0.8, 0.2, 1],
        edge_schema_property: ['w1', 'w2']
      }).write({
        db:{
          property: 'embedding'
        }
      })
      

      Direct Return

      Alias Ordinal | Type | Description | Columns
      0 | []perNode | Node and its embeddings | _uuid, embedding_result
      algo(fastRP).params({
        dimension : 10,
        normalizationStrength: 1.5,
        iterationWeights: [0, 0.8, 0.2, 1]
      }) as embeddings
      return embeddings
      

      Stream Return

      Alias Ordinal | Type | Description | Columns
      0 | []perNode | Node and its embeddings | _uuid, embedding_result
      algo(fastRP).params({
        dimension : 10,
        normalizationStrength: 1.5,
        iterationWeights: [0, 0.8, 0.2, 1]
      }).stream() as embeddings
      return embeddings
      