Deep Axial Hypercomplex Networks

Nazmul Shahadat, Anthony S. Maida
University of Louisiana at Lafayette, Lafayette, LA 70504, USA
nazmul.ruet@gmail.com, maida@louisiana.edu

Abstract

Over the past decade, deep hypercomplex-inspired networks have enhanced feature extraction for image classification by enabling weight sharing across input channels. Recent works improve representational capability using hypercomplex-inspired networks, but at a high computational cost. This paper reduces that cost by factorizing a quaternion 2D convolutional module into two consecutive vectormap 1D convolutional modules. We also use fully connected layers based on 5D parameterized hypercomplex multiplication. Incorporating both yields our proposed hypercomplex network, a novel architecture that can be assembled to construct deep axial-hypercomplex networks (DANs) for image classification. We conduct experiments on the CIFAR benchmarks, SVHN, and Tiny ImageNet and achieve better performance with fewer trainable parameters and FLOPS. Our proposed model achieves almost 2% higher accuracy on the CIFAR and SVHN datasets and more than 3% on Tiny ImageNet, with six times fewer parameters than real-valued ResNets. It also shows state-of-the-art performance on the CIFAR benchmarks in hypercomplex space.
1. Introduction

Convolutional neural networks (CNNs) and hypercomplex CNNs (HCNNs) for image classification form a hierarchical design in which different layers extract different levels of feature representation. CNNs have shown significant success in recent decades [2, 9]. In vision tasks, these CNN-based feature extraction designs can be improved with regard to handling multi-dimensional data. To enhance this ability, HCNNs treat the multi-dimensional data as a cohesive entity, applying cross-channel weight sharing to discover cross-channel relationships [4, 5, 14, 15]. Implementations in hypercomplex space provide further advantages [1, 3, 7, 13], and it has been shown that HCNNs can create better output representations [13, 16, 17].

Recently, HCNNs of various dimensions, such as 2D HCNNs [21], 4D HCNNs [4, 14, 15], 8D HCNNs [20], and generalized HCNNs [5], have been studied; all retain hypercomplex properties. The reason behind the success of HCNNs is that they capture cross-channel relationships [4, 5, 14, 15, 17]. Among them, quaternion networks come with a full set of algebraic operations and have outperformed the other HCNNs. Stacking coherent quaternion convolutional layers achieves better representational feature maps and has shown promising results in vision tasks [4, 15, 17]. These networks are cost-effective compared to real-valued CNNs and fully connected networks, but they are still very expensive for large inputs such as those in vision tasks.
This work uses an axial hypercomplex network that: 1) handles multidimensional inputs; 2) applies weight sharing across input channels; 3) captures cross-channel correlations; 4) reduces computational cost; and 5) increases validation accuracy on image classification datasets. The main idea of this work is to decompose the hypercomplex 2D convolutional operation into two consecutive vectormap 1D convolutional operations. Splitting the 2D spatial convolution into height-axis and width-axis spatial convolutions enables the model to reduce cost once again. Additionally, we apply a quaternion-based stem layer and a parameterized hypercomplex multiplication (PHM) based fully connected layer to obtain better representations and better generalization performance.

This paper conducts extensive experiments that show the effectiveness of our novel axial hypercomplex networks on four image classification datasets. Our novel contribution is a new model that factorizes the two-dimensional spatial hypercomplex convolutional operation into two one-dimensional operations applied along the height axis and the width axis sequentially. Our contributions are:

- Replacing the spatial 3 × 3 QCNN in the bottleneck block of quaternion ResNets with two VCNNs and showing the effectiveness of the proposed networks.

- Applying a QCNN in the stem layer (the first layer of the network), resulting in a quaternion-stem model.

- Like QPHM [16], applying a PHM-based dense layer in the backend of the network.

Figure 1. Proposed axial-hypercomplex network with a PHM-based fully-connected layer in the backend. "AHNN" stands for the axial-hypercomplex neural network bottleneck block described in Figure 2. Here, Qin = Qr + Qw + Qx + Qy + Qz, H = Hr + Hw + Hx + Hy + Hz, and Qout = Qro + Qwo + Qxo + Qyo + Qzo are the input, the hypercomplex parameterized weight, and the output, respectively. For the calculation of H, see the "PHM Layer" section.

The proposed axial hypercomplex ResNets outperform the baseline networks on the classification datasets, as shown in Tables 2, 3, and 4. Our experiments show that the proposed model achieves state-of-the-art results with far fewer trainable parameters and FLOPS on the CIFAR benchmarks in hypercomplex space.

2. Background and Related Work

2.1. Quaternion Convolution

Deep quaternion CNNs extend complex CNNs [18]. This section explains cross-channel weight sharing; [4] and [15] extended the principles of quaternion convolution operations and weight initialization. The quaternion number system is formed as

Q = r + ix + jy + kz;  r, x, y, z ∈ R,

where r, x, y, and z are real values and i, j, and k are imaginary units.
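For reference, Equation 1 below is the term-by-term expansion of the standard Hamilton product rules for these units (a standard identity, restated here for readability):

$$ i^2 = j^2 = k^2 = ijk = -1, \qquad ij = k = -ji, \quad jk = i = -kj, \quad ki = j = -ik. $$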
Quaternion convolution between a quaternion filter matrix F and a quaternion input vector M is defined as [4]:

M ⊛ F = (Or, Oi, Oj, Ok)
      = (Mr ∗ Fr − Mi ∗ Fi − Mj ∗ Fj − Mk ∗ Fk,
         Mi ∗ Fr + Mr ∗ Fi + Mj ∗ Fk − Mk ∗ Fj,
         Mj ∗ Fr + Mr ∗ Fj + Mk ∗ Fi − Mi ∗ Fk,
         Mk ∗ Fr + Mr ∗ Fk + Mi ∗ Fj − Mj ∗ Fi)        (1)

where M ⊛ F and all other quantities are quaternion numbers; Or is the real part, and Oi, Oj, and Ok are the imaginary parts. Although there are 16 real-valued convolutions in Equation 1, only four kernels are used, each reused four times. This weight sharing [14] forces the model to learn cross-channel interrelationships. According to the quaternion definition, a quaternion layer can accept four or m input channels, where m is divisible by four. To process m input channels (m ≥ 4), m/4 independent quaternion convolution modules are required, and each module has its own weight set. Cross-channel weight sharing allows the discovery of cross-channel input correlations. Our weight initialization is the same as in [4].
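To make Equation 1 concrete, the following is a minimal sketch of one quaternion convolution group in PyTorch-style Python (our illustration under the definitions above, not the authors' released code); Mr, Mi, Mj, Mk are the four input channel groups and Fr, Fi, Fj, Fk are the four shared real-valued kernels:

```python
import torch.nn.functional as F

def quaternion_conv2d(Mr, Mi, Mj, Mk, Fr, Fi, Fj, Fk, stride=1, padding=1):
    """Equation 1: sixteen real convolutions built from only four kernels."""
    conv = lambda m, f: F.conv2d(m, f, stride=stride, padding=padding)
    Or = conv(Mr, Fr) - conv(Mi, Fi) - conv(Mj, Fj) - conv(Mk, Fk)  # real part
    Oi = conv(Mi, Fr) + conv(Mr, Fi) + conv(Mj, Fk) - conv(Mk, Fj)
    Oj = conv(Mj, Fr) + conv(Mr, Fj) + conv(Mk, Fi) - conv(Mi, Fk)
    Ok = conv(Mk, Fr) + conv(Mr, Fk) + conv(Mi, Fj) - conv(Mj, Fi)
    return Or, Oi, Oj, Ok
```

Reusing the same four kernels across all sixteen terms is exactly the cross-channel weight sharing described above; for m input channels, m/4 such groups run independently.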
2.2. Vectormap Convolution

Figure 2. The AHNN bottleneck block used in our proposed axial-hypercomplex networks. "bn", "quat", and "VCNN" stand for batch normalization, quaternion CNN, and vectormap CNN, respectively.

We explain 3D generalized hypercomplex networks (VCNNs) because VCNNs are used in our proposed models. The VCNN is more flexible, as it does not require 4D input. However, it still uses cross-channel weight sharing: in the 3 × 3 kernel matrix of Equation 2, only three filters A, B, and C are used. The vectormap convolution operation is defined as

[ R(M ∗ F) ]         [ A  B  C ]   [ x ]
[ I(M ∗ F) ]  = L ⊙  [ C  A  B ] ∗ [ y ]        (2)
[ J(M ∗ F) ]         [ B  C  A ]   [ z ]

where A, B, and C are real-valued kernels, x, y, and z are real-valued vectors, and L is a learnable matrix, L ∈ R^(D3 × D3), where D3 stands for the 3-dimensional input channels. The initial value of L is

      [ 1  1  1 ]
L  =  [ 1  1  1 ]        (3)
      [ 1  1  1 ]

Our weight initialization follows [5].

2.3. PHM Layer

Parameterized hypercomplex multiplication (PHM) is another form of generalized hypercomplex network, explained in [22]. As we use the PHM layer only in the fully connected (FC) layer, our explanation is restricted to the PHM-based dense layer. It is defined as y = Hx + b, where H ∈ R^(k × d) represents the PHM layer and is calculated as H = Σ_{i=1}^{n} I_i ⊗ A_i, where I_i ∈ R^(n × n) and A_i ∈ R^((k/n) × (d/n)) are learnable parameter matrices, i = 1, ..., n (n = 4 or 5), and ⊗ denotes the Kronecker product. These matrices can be reused, which leads to parameter reduction.
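As a sketch of this construction (ours, not the reference implementation; torch.kron is assumed available, as in recent PyTorch releases), H can be assembled and applied as follows:

```python
import torch

def phm_weight(I_mats, A_mats):
    """H = sum_i kron(I_i, A_i): n*n^2 + n*(k/n)*(d/n) = n^3 + kd/n
    parameters instead of k*d for an ordinary dense weight."""
    return sum(torch.kron(I, A) for I, A in zip(I_mats, A_mats))

n, k, d = 5, 100, 100                                     # hypothetical 5D PHM sizes
I_mats = [torch.randn(n, n) for _ in range(n)]            # I_i in R^(n x n)
A_mats = [torch.randn(k // n, d // n) for _ in range(n)]  # A_i in R^(k/n x d/n)
H = phm_weight(I_mats, A_mats)                            # H in R^(k x d)
y = H @ torch.randn(d)                                    # y = Hx (bias omitted)
```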
The flattened output of the CNN is used as the input to the PHM FC layer. For the 5D hypercomplex case, the inputs are split as Qin = Qr + Qw + Qx + Qy + Qz and the outputs are merged as Qout = Qro + Qwo + Qxo + Qyo + Qzo. The 4D hypercomplex parameter matrix, which expresses the Hamilton product, is discussed in [22], and the 5D hypercomplex parameter matrix of the PHM operation is explained in [16]. The 5D parameter matrix is used to construct a 5D PHM FC layer that preserves all properties of the PHM layer and of hypercomplex networks. This work uses the 5D PHM layer.

3. Proposed Axial Hypercomplex Networks

Complex convolutional neural networks (CCNNs), QCNNs, octonion convolutional neural networks (OCNNs), VCNNs, and PHM are versions of HCNNs that provide all the advantages of HCNNs, such as weight sharing across input channels and the ability to discover cross-channel correlations. These HCNNs perform better with fewer trainable parameters on vision applications, but they are still computationally expensive. For vision tasks, these HCNNs take O(N^2) resources for an image of length N, where N is the size of the flattened pixel set. For a 2D image of height h and width w, where N = hw and h = w, the computational cost is O((hw)^2) = O(h^2 w^2) = O(h^4).

This section describes our proposed axial-hypercomplex model (Figures 1 and 2), which reduces this computational cost. Axial networks were first used in [8, 19]. To implement our proposed model, we follow the assumption that images are approximately square, so that the pixel counts along h and w are the same and both are much smaller than the total pixel count hw [19].
To translate a quaternion convolutional bottleneck block into an axial-hypercomplex bottleneck block, we replace the 3 × 3 spatial quaternion convolutional operation with two axial vectormap convolutional neural network (VCNN) layers. These layers are applied along the height axis (a 3-channel 3 × 1 VCNN layer) and the width axis (a 3-channel 1 × 3 VCNN layer) sequentially. The two 1 × 1 quaternion convolutional layers remain unchanged, as in the original QCNNs [4]; they are responsible for reducing and then restoring the number of channels. This forms our proposed axial-hypercomplex bottleneck block, shown in Figure 2. This block is stacked multiple times to construct axial-hypercomplex ResNets.

Layer (output size) | Deep Quaternion ResNet | Vectormap ResNet | QPHM | Axial Hypercomplex
Stem (32x32) | 3x3Q, 120, std=1 | 3x3V, 120, std=1 | 3x3Q, 120, std=1 | 3x3Q, 120, std=1
Bottleneck group 1 (32x32), x3 | [1x1Q, 120; 3x3Q, 120; 1x1Q, 480] | [1x1V, 120; 3x3V, 120; 1x1V, 480] | [1x1QP, 120; 3x3QP, 120; 1x1QP, 480] | [1x1Q, 120; 3x1AV, 120; 1x3AV, 120; 1x1Q, 480]
Bottleneck group 2 (16x16), x4 | [1x1Q, 240; 3x3Q, 240; 1x1Q, 960] | [1x1V, 240; 3x3V, 240; 1x1V, 960] | [1x1QP, 240; 3x3QP, 240; 1x1QP, 960] | [1x1Q, 240; 3x1AV, 240; 1x3AV, 240; 1x1Q, 960]
Bottleneck group 3 (8x8), x6 | [1x1Q, 480; 3x3Q, 480; 1x1Q, 1920] | [1x1V, 480; 3x3V, 480; 1x1V, 1920] | [1x1QP, 480; 3x3QP, 480; 1x1QP, 1920] | [1x1Q, 480; 3x1AV, 480; 1x3AV, 480; 1x1Q, 1920]
Bottleneck group 4 (4x4), x3 | [1x1Q, 960; 3x3Q, 960; 1x1Q, 3840] | [1x1V, 960; 3x3V, 960; 1x1V, 3840] | [1x1QP, 960; 3x3QP, 960; 1x1QP, 3840] | [1x1Q, 960; 3x1AV, 960; 1x3AV, 960; 1x1Q, 3840]
Pooling layer (1x1x100) | global average-pool, 100 outputs (all models)
Output (1x1x100) | fully connected layer, softmax | fully connected layer, softmax | fully connected layer, softmax | 5D PHM layer

Table 1. The 50-layer architectures tested on CIFAR-100: quaternion ResNet [4, 5], vectormap ResNet [5], QPHM [16], and our proposed axial-hypercomplex networks. The input is a 32x32x3 color image for the CIFAR benchmarks. The number of stacked bottleneck modules is specified by the multipliers. "Q", "V", "QP", "AV", and "std" denote quaternion convolution, 3D vectormap convolution, QPHM (quaternion networks with a 4D PHM layer), axial vectormap convolution, and stride, respectively. Integers (e.g., 120, 240) denote the number of output channels. PHM stands for parameterized hypercomplex multiplication; this work uses a 5D PHM-based FC layer.
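The layer ordering of this block (Figure 2; axial-hypercomplex column of Table 1) can be sketched as follows, with plain nn.Conv2d layers standing in for the grouped quaternion and vectormap modules described in Section 2; this is a structural sketch, not the exact implementation (stride and channel-group handling omitted):

```python
import torch.nn as nn

class AHNNBottleneck(nn.Module):
    """1x1 reduce -> 3x1 height-axis conv -> 1x3 width-axis conv -> 1x1 expand,
    each followed by batch norm and ReLU, plus a residual connection."""
    def __init__(self, channels, mid):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, mid, 1),                  # 1x1 quaternion conv (reduce)
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, (3, 1), padding=(1, 0)),  # 3x1 axial vectormap conv (height)
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, (1, 3), padding=(0, 1)),  # 1x3 axial vectormap conv (width)
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, 1),                  # 1x1 quaternion conv (expand)
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + x)  # residual add
```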
Axial-hypercomplex models work on only one dimension at a time, but input images are 2-dimensional. For two-dimensional vision tasks, a square 2D input with h = w (so that w^2 = N, where N is the sequence length of the flattened pixel set) is split into two 1D vectors. The 3-channel VCNN operation is first applied along the 1D image regions of length h and then along the 1D image regions of length w. Merging these two 1D operations reduces the cost to O(h · h^2) = O(h^3) from the HCNN cost of O(h^4).

Each quaternion convolution accepts four channels of input and produces four channels of output. Hence, the required number of 1 × 1 quaternion conv2d modules equals the number of input channels divided by four. The output channels of the down-sampling 1 × 1 quaternion layer are merged and fed to the axial VCNN modules, and the output channels of the axial VCNN modules are split into groups of four again for the up-sampling 1 × 1 quaternion conv2d layer [4, 17]. One quaternion convolution is applied to each group of four channels, and one vectormap convolution is applied to each group of three channels; like the vectormap layer, each axial vectormap module takes three input channels. Thus, the weight sharing is compartmentalized into groups of four input channels and then into groups of three input channels.

For better representation, a quaternion convolution layer is also used in the stem layer (the first layer of the network) as a quaternion-based frontend, and a PHM-based fully-connected dense layer is used as the backend of the deep axial-hypercomplex networks (DANs). Figure 1 illustrates our proposed axial-hypercomplex network architecture.
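A quick sanity check on the factorization (our arithmetic, not from the paper): per output position and channel, a 3 × 3 kernel needs 9 multiply-accumulates, while the 3 × 1 plus 1 × 3 pair needs 6, in line with the O(h^4)-to-O(h^3) reduction argued above.

```python
# Multiply-accumulates per channel for one full h x w feature map:
k, h, w = 3, 32, 32
full_2d  = (k * k) * h * w   # one 3x3 pass: 9,216 MACs
axial_1d = 2 * k * h * w     # one 3x1 pass + one 1x3 pass: 6,144 MACs
print(full_2d, axial_1d)     # 9216 6144
```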
4. Experiment

We conduct extensive experiments on four classification datasets to analyze the effectiveness of our proposed axial-hypercomplex model. As QCNNs, VCNNs, residual networks (ResNets), QPHM [16], and VPHM [16] all perform 2D spatial convolution operations,

Model Name | Layers | Dataset | Params | FLOPS | Latency | Validation Accuracy
ResNet [6] | 26 | CIFAR10 | 40.9M | 2.56G | 0.86ms | 94.68
ResNet-with-QPHM [16] | 26 | CIFAR10 | 40.8M | 2.55G | 0.64ms | 95.32
Quaternion [4] | 26 | CIFAR10 | 10.2M | 1.11G | 0.65ms | 94.89
Vectormap [5] | 26 | CIFAR10 | 13.6M | 1.09G | 0.65ms | 94.76
QPHM [16] | 26 | CIFAR10 | 10.2M | 1.10G | 0.64ms | 95.26
VPHM [16] | 26 | CIFAR10 | 13.6M | 1.08G | 0.67ms | 95.15
Axial-Hypercomplex | 26 | CIFAR10 | 6.2M | 1.06G | 0.68ms | 95.91-95.85
ResNet [6] | 35 | CIFAR10 | 57.8M | 3.31G | 1.08ms | 94.95
ResNet-with-QPHM [16] | 35 | CIFAR10 | 57.7M | 3.31G | 0.81ms | 95.80
Quaternion [4] | 35 | CIFAR10 | 14.5M | 1.47G | 0.82ms | 95.33
Vectormap [5] | 35 | CIFAR10 | 19.3M | 1.45G | 0.84ms | 95.06
QPHM [16] | 35 | CIFAR10 | 14.5M | 1.46G | 0.79ms | 95.55
VPHM [16] | 35 | CIFAR10 | 19.3M | 1.44G | 0.82ms | 95.60
Axial-Hypercomplex | 35 | CIFAR10 | 9.2M | 1.36G | 0.84ms | 96.49-96.45
ResNet [6] | 50 | CIFAR10 | 82.5M | 4.57G | 1.32ms | 94.08
ResNet-with-QPHM [16] | 50 | CIFAR10 | 82.5M | 4.57G | 0.81ms | 95.86
Quaternion [4] | 50 | CIFAR10 | 21.09M | 1.93G | 1.06ms | 95.42
Vectormap [5] | 50 | CIFAR10 | 27.6M | 1.93G | 1.13ms | 95.37
QPHM [16] | 50 | CIFAR10 | 20.7M | 1.92G | 1.06ms | 95.75
VPHM [16] | 50 | CIFAR10 | 27.5M | 1.92G | 1.08ms | 95.76
Axial-Hypercomplex | 50 | CIFAR10 | 13.6M | 1.75G | 1.09ms | 96.79-96.71
ResNet [6] | 26 | CIFAR100 | 41.2M | 2.56G | 0.89ms | 78.21
ResNet-with-QPHM [16] | 26 | CIFAR100 | 40.9M | 2.56G | 0.64ms | 79.14
Quaternion [4] | 26 | CIFAR100 | 10.6M | 1.15G | 0.64ms | 77.65
Vectormap [5] | 26 | CIFAR100 | 13.6M | 1.15G | 0.64ms | 77.65
QPHM [16] | 26 | CIFAR100 | 10.3M | 1.11G | 0.65ms | 78.15
VPHM [16] | 26 | CIFAR100 | 13.7M | 1.09G | 0.66ms | 78.14
Axial-Hypercomplex | 26 | CIFAR100 | 6.2M | 1.06G | 0.69ms | 79.42-79.24
ResNet [6] | 35 | CIFAR100 | 58.1M | 3.31G | 1.07ms | 78.72
ResNet-with-QPHM [16] | 35 | CIFAR100 | 57.8M | 3.31G | 0.81ms | 79.65
Quaternion [4] | 35 | CIFAR100 | 14.5M | 1.51G | 0.81ms | 78.96
Vectormap [5] | 35 | CIFAR100 | 19.3M | 1.48G | 0.84ms | 79.52
QPHM [16] | 35 | CIFAR100 | 14.5M | 1.47G | 0.82ms | 78.46
VPHM [16] | 35 | CIFAR100 | 19.6M | 1.45G | 0.82ms | 79.86
Axial-Hypercomplex | 35 | CIFAR100 | 9.2M | 1.36G | 0.85ms | 79.93-79.63
ResNet [6] | 50 | CIFAR100 | 82.9M | 4.57G | 1.36ms | 78.95
ResNet-with-QPHM [16] | 50 | CIFAR100 | 82.6M | 4.57G | 1.09ms | 79.89
Quaternion [4] | 50 | CIFAR100 | 21.09M | 1.96G | 1.06ms | 79.17
Vectormap [5] | 50 | CIFAR100 | 27.6M | 1.93G | 1.13ms | 79.39
QPHM [16] | 50 | CIFAR100 | 20.7M | 1.93G | 1.05ms | 78.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='22 VPHM [16] 27.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='5M 1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='92G 1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='08ms 79.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='49 Axial-Hypercomplex 13.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='6M 1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='75G 1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='09ms 80.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='81-80.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='75 Table 2.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Image classification performance on the CIFAR benchmarks for 26, 35, and 50-layer architectures.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Here, QPHM, and VPHM define the quaternion networks with PHM FC layer, and vectormap networks with the PHM FC layer, respectively.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' therefore we compare our proposed axial hypercomplex networks performance with the above-mentioned base- line models.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Among them, all models perform Hamilto- nian products like our proposed model except ResNets.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 4.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='1.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Method We conducted our experiments by using five- dimensional PHM dense layer in the backend of the network, quaternion network at the beginning of the Model Name Layers Params FLOPS Latency Validation Accuracy ResNet [6] 40.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='9M 2.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='56G 0.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='82ms 96.' 
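For reference, the Hamilton product that these quaternion layers build on multiplies two quaternions of the form a + bi + cj + dk; a minimal Python sketch of the componentwise rule:

```python
def hamilton_product(q, p):
    """Hamilton product of two quaternions q = (a, b, c, d) and
    p = (e, f, g, h), each representing a + bi + cj + dk. In a
    quaternion convolution, this multiplication pattern is what
    shares one set of weights across the four channel groups."""
    a, b, c, d = q
    e, f, g, h = p
    return (a*e - b*f - c*g - d*h,   # real part
            a*f + b*e + c*h - d*g,   # i component
            a*g - b*h + c*e + d*f,   # j component
            a*h + b*g - c*f + d*e)   # k component
```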
4.1. Method

We conducted our experiments using a five-dimensional PHM dense layer in the backend of the network, a quaternion network at the beginning of the network, and axial-hypercomplex residual bottleneck blocks, on the CIFAR benchmark datasets [10], Street View House Numbers (SVHN) [12], and Tiny ImageNet [11].

| Model | Layers | Params | FLOPS | Latency | Val. Acc. (%) |
|---|---|---|---|---|---|
| ResNet [6] | 26 | 40.9M | 2.56G | 0.82ms | 96.04 |
| ResNet-with-QPHM [16] | 26 | 40.8M | 2.56G | 0.62ms | 96.64 |
| Quaternion [4] | 26 | 10.2M | 1.11G | 0.66ms | 95.88 |
| Vectormap [5] | 26 | 13.6M | 1.10G | 0.66ms | 95.93 |
| QPHM [16] | 26 | 10.2M | 1.10G | 0.62ms | 95.97 |
| VPHM [16] | 26 | 13.6M | 1.08G | 0.64ms | 96.24 |
| Axial-Hypercomplex | 26 | 6.2M | 1.06G | 0.69ms | 97.21-97.05 |
| ResNet [6] | 35 | 57.8M | 3.31G | 0.98ms | 95.74 |
| ResNet-with-QPHM [16] | 35 | 57.7M | 3.31G | 0.79ms | 96.22 |
| Quaternion [4] | 35 | 14.5M | 1.47G | 0.84ms | 95.95 |
| Vectormap [5] | 35 | 19.5M | 1.45G | 0.84ms | 95.97 |
| QPHM [16] | 35 | 14.5M | 1.45G | 0.82ms | 95.99 |
| VPHM [16] | 35 | 19.3M | 1.44G | 0.82ms | 96.34 |
| Axial-Hypercomplex | 35 | 9.2M | 1.36G | 0.85ms | 97.25-96.90 |
| ResNet [6] | 50 | 82.5M | 4.57G | 1.19ms | 95.76 |
| ResNet-with-QPHM [16] | 50 | 82.5M | 4.57G | 1.04ms | 96.78 |
| Quaternion [4] | 50 | 20.7M | 1.94G | 1.04ms | 96.24 |
| Vectormap [5] | 50 | 27.6M | 1.93G | 1.11ms | 96.39 |
| QPHM [16] | 50 | 20.7M | 1.93G | 1.04ms | 96.46 |
| VPHM [16] | 50 | 27.5M | 1.92G | 1.09ms | 96.49 |
| Axial-Hypercomplex | 50 | 13.6M | 1.75G | 1.11ms | 97.47-97.25 |

Table 3. Image classification performance on the SVHN benchmark for 26-, 35-, and 50-layer architectures. Here, QPHM and VPHM denote the quaternion networks with the PHM FC layer and the vectormap networks with the PHM FC layer, respectively.
| Model | Layers | Params | FLOPS | Latency | Val. Acc. (%) |
|---|---|---|---|---|---|
| ResNet [6] | 26 | 41.6M | 10.2G | 3.06ms | 57.21 |
| ResNet-with-QPHM [16] | 26 | 41M | 2.56G | 2.31ms | 57.84 |
| Quaternion [4] | 26 | 11.02M | 4.54G | 2.48ms | 53.84 |
| Vectormap [5] | 26 | 14.4M | 4.56G | 2.88ms | 56.15 |
| QPHM [16] | 26 | 10.4M | 1.11G | 2.31ms | 54.02 |
| VPHM [16] | 26 | 13.8M | 4.44G | 3.27ms | 53.11 |
| Axial-Hypercomplex | 26 | 6.3M | 1.06G | 2.49ms | 58.56-58.06 |
| ResNet [6] | 35 | 58.5M | 13.2G | 3.21ms | 57.80 |
| ResNet-with-QPHM [16] | 35 | 57.9M | 3.31G | 2.85ms | 59 |
| Quaternion [4] | 35 | 15.2M | 5.98G | 3.52ms | 54.53 |
| Vectormap [5] | 35 | 20.07M | 5.98G | 3.76ms | 55.99 |
| QPHM [16] | 35 | 14.6M | 1.47G | 2.88ms | 56.42 |
| VPHM [16] | 35 | 19.4M | 5.88G | 4.08ms | 56.10 |
| Axial-Hypercomplex | 35 | 9.3M | 1.36G | 2.97ms | 60.06-59.87 |
| ResNet [6] | 50 | 83.2M | 18.2G | 3.77ms | 59.06 |
| ResNet-with-QPHM [16] | 50 | 82.6M | 4.57G | 3.66ms | 60.30 |
| Quaternion [4] | 50 | 21.4M | 7.87G | 4.14ms | 56.63 |
| Vectormap [5] | 50 | 28.3M | 7.87G | 4.34ms | 57.52 |
| QPHM [16] | 50 | 20.8M | 1.93G | 3.88ms | 59.42 |
| VPHM [16] | 50 | 27.7M | 7.75G | 4.51ms | 58.96 |
| Axial-Hypercomplex | 50 | 13.7M | 1.75G | 3.93ms | 62.73-62.07 |

Table 4. Image classification performance on the Tiny ImageNet benchmark for 26-, 35-, and 50-layer architectures. Here, QPHM and VPHM denote the quaternion networks with the PHM FC layer and the vectormap networks with the PHM FC layer, respectively.

The models we tested against our proposed method are the standard DCNNs [6], the DQNNs [4], the axial ResNet with QPHM [16], QPHM [16], and VPHM [16]. The CIFAR-10 and CIFAR-100 datasets each consist of 60,000 color images of size 32 × 32 pixels. They fall into 10 and 100 distinct classes, respectively, and are split into a training set of 50,000 images and a test set of 10,000 images. We apply the standard data augmentation schemes for these datasets, as in [4-6, 16]. Both datasets are normalized using the per-channel mean and standard deviation. We perform horizontal flips and take random crops from images padded by 4 pixels on each side: the padded 40 × 40 image is randomly cropped back to 32 × 32.
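A minimal torchvision sketch of this CIFAR augmentation pipeline is shown below; the normalization statistics are the commonly used CIFAR-10 values, not numbers taken from the paper.

```python
import torchvision.transforms as T

# Sketch of the augmentation described above: pad each side by 4 pixels
# (32x32 -> 40x40), take a random 32x32 crop, flip horizontally, and
# normalize per channel. The mean/std values below are the standard
# CIFAR-10 statistics and are an assumption.
cifar_train_transform = T.Compose([
    T.RandomCrop(32, padding=4),   # pad to 40x40, then random 32x32 crop
    T.RandomHorizontalFlip(),
    T.ToTensor(),
    T.Normalize(mean=(0.4914, 0.4822, 0.4465),
                std=(0.2470, 0.2435, 0.2616)),
])
```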
SVHN contains about 600,000 digit images [12]. For the SVHN experiments we do no image preprocessing beyond simple mean/std normalization. We use similar augmentation for the Tiny ImageNet dataset, which contains 100,000 training images from 200 classes (500 per class), downsized to 64 × 64 color images; its test set contains 10,000 images [11].

All baseline models (the real-valued networks, the original quaternion network, the original vectormap network, QPHM, and VPHM) were trained using the same components and the same datasets. All models in Table 2 were trained with the same hyperparameters and the same numbers of output channels. The 50-layer architectural details of the above-mentioned models are depicted in Table 1 for the CIFAR-100 dataset; due to space limitations, the deep ResNet and VPHM architectures are not shown there.

In the stem layer, a 3 × 3 convolution is used for the deep ResNets [6]; a 3 × 3 quaternion convolution is used for the deep quaternion ResNets [4, 18], for QPHM [16], and for our proposed axial-hypercomplex networks; and a 3 × 3 vectormap convolution is used for the deep vectormap ResNets [5] and the VPHM networks [16], in all cases with stride 1 and 120 output filters. We use parameterized hypercomplex multiplication (PHM) for the dense layer in the backend of the deep ResNets, QPHM, VPHM, and our proposed axial-hypercomplex networks.
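As a concrete illustration, the following is a minimal PyTorch sketch of such a PHM dense layer, following the sum-of-Kronecker-products parameterization of [22]; the class name, initialization scale, and tensor layout are our own choices, not the paper's implementation.

```python
import torch
import torch.nn as nn

class PHMLinear(nn.Module):
    """Sketch of a parameterized hypercomplex multiplication (PHM) dense
    layer: the weight matrix is built as a sum of Kronecker products,
    W = sum_i A_i (x) S_i, so it needs roughly 1/n of the parameters of
    a real-valued nn.Linear. n=5 gives the five-dimensional PHM used
    here; in_features and out_features must be divisible by n."""

    def __init__(self, in_features: int, out_features: int, n: int = 5):
        super().__init__()
        assert in_features % n == 0 and out_features % n == 0
        self.n = n
        # n small "rule" matrices (n x n) and n weight blocks.
        self.A = nn.Parameter(torch.randn(n, n, n) * 0.1)
        self.S = nn.Parameter(
            torch.randn(n, out_features // n, in_features // n) * 0.1)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sum of Kronecker products: (n,n,n) with (n,o/n,i/n) -> (o,i).
        W = torch.einsum('nab,ncd->acbd', self.A, self.S)
        W = W.reshape(self.n * self.S.shape[1], self.n * self.S.shape[2])
        return x @ W.t() + self.bias
```

For example, with the 960 backend channels and 100 CIFAR-100 classes used here, `PHMLinear(960, 100, n=5)` stores roughly one-fifth of the weights of an equivalent real-valued dense layer.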
In the bottleneck blocks, the numbers of output channels of the four bottleneck groups are 120, 240, 480, and 960 for all networks. In this experiment, we analyze 26-, 35-, and 50-layer architectures with the bottleneck block multipliers [1, 2, 4, 1], [2, 3, 4, 2], and [3, 4, 6, 3], respectively; these are depicted in Table 1. We ran all of the models with the stochastic gradient descent optimizer, warming the learning rate up linearly from zero to 0.1 over the first 10 epochs and then applying cosine learning-rate scheduling from epoch 11 to epoch 150. All models were trained with a batch size of 128.
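A minimal sketch of this schedule, assuming per-epoch updates and a decay to zero at epoch 150 (the exact endpoint handling and the momentum value in the usage comment are our assumptions):

```python
import math

def lr_at_epoch(epoch: int, base_lr: float = 0.1,
                warmup_epochs: int = 10, total_epochs: int = 150) -> float:
    """Linear warmup from zero to base_lr over the first 10 epochs,
    then cosine decay toward zero from epoch 11 to epoch 150."""
    if epoch < warmup_epochs:
        return base_lr * (epoch + 1) / warmup_epochs
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

# Usage with the SGD optimizer described above (momentum is assumed):
# optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# for epoch in range(150):
#     for group in optimizer.param_groups:
#         group['lr'] = lr_at_epoch(epoch)
#     train_one_epoch(...)
```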
4.2. Results

The overall results of all models (the baselines and our proposed networks) appear in Tables 2, 3, and 4. The top half of Table 2 shows the results for the CIFAR-10 dataset and the bottom half presents the results for CIFAR-100; both datasets were tested with the 26-, 35-, and 50-layer architectures. For each model, the tables report the parameter count, the FLOPS count (number of multiply-add operations), the latency (inference time required to process a single image), and the validation accuracy in percent. We evaluate the original ResNets [6], ResNet with QPHM [16], the original quaternion networks [4], the original vectormap networks [5], QPHM [16], and VPHM [16] in the same configuration as our proposed axial-hypercomplex networks. Our proposed axial-hypercomplex networks achieve better validation accuracy with lower parameter counts and FLOPS on the CIFAR-10 and CIFAR-100 datasets than the baseline networks. More precisely, our method uses almost six times fewer parameters than the ResNets, about one-third fewer than the quaternion and QPHM networks, and about half as many as the vectormap and VPHM networks. Moreover, the axial-hypercomplex networks achieved state-of-the-art results for these CIFAR benchmarks in hypercomplex space.

The performances for the SVHN and Tiny ImageNet datasets are shown in Tables 3 and 4 for all architectures. As on the CIFAR datasets, the axial-hypercomplex network's validation accuracies outperform the other baseline networks with fewer trainable parameters and FLOPS. Tables 2, 3, and 4 report our proposed model's performance as a range over three runs. However, the latency of the axial-hypercomplex networks is slightly higher in some cases than that of the quaternion-based networks. This may be due to the use of vectormap convolutions alongside quaternion convolutions, since the latency of vectormap networks is higher.

5. Discussion and Conclusions

This paper proposes axial-hypercomplex convolutions to reduce the cost of 2D convolutional operations and shows their effectiveness on image classification tasks. We also applied a four-dimensional PHM layer in the network's backend. On the CIFAR benchmarks, our proposed axial-hypercomplex network, formed by stacking axial-vectormap (three-dimensional) convolutions in the quaternion bottleneck blocks, achieved state-of-the-art results among hypercomplex networks.

Our main conclusion is that using quaternion convolutions as the frontend stem layer, a four/five-dimensional PHM-based densely connected backend layer, and axial-hypercomplex bottleneck blocks improves classification performance on the CIFAR benchmarks, SVHN, and Tiny ImageNet in comparison to the other models we tested. Our proposed method factorizes a channel-wise 2D convolution (a hypercomplex convolution that works along the channels) into a column convolution and a row convolution. Extensive experiments show that this leads to systematic improvement with far fewer trainable parameters on image classification: the proposed method saves 33% and 50% of the trainable parameters compared to the original quaternion and vectormap networks and to the QPHM and VPHM networks, respectively.
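To make the factorization concrete, the sketch below replaces one k × k spatial convolution with a height-axis convolution followed by a width-axis convolution. Plain nn.Conv2d stands in for the paper's vectormap convolutions, so only the axial decomposition itself is illustrated, not the hypercomplex weight sharing.

```python
import torch.nn as nn

class AxialConvBlock(nn.Module):
    """Sketch of the axial factorization described above: one k x k
    spatial convolution is replaced by a (k x 1) height-axis convolution
    followed by a (1 x k) width-axis convolution, cutting the spatial
    weight count from k*k to 2*k per channel pair."""

    def __init__(self, channels: int, k: int = 3):
        super().__init__()
        self.height = nn.Conv2d(channels, channels, kernel_size=(k, 1),
                                padding=(k // 2, 0), bias=False)
        self.width = nn.Conv2d(channels, channels, kernel_size=(1, k),
                               padding=(0, k // 2), bias=False)

    def forward(self, x):
        # Two consecutive 1D convolutions in place of one 2D convolution;
        # this is also the source of the extra latency discussed below.
        return self.width(self.height(x))
```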
Although our proposed axial-hypercomplex design reduces parameter counts and FLOPS, it exhibits higher latency than the real-valued and hypercomplex-valued convolutional networks. This is because the model performs convolution twice (once along the height axis and once along the width axis) and incurs transition time in moving from one 2D convolution to two consecutive 1D convolutions. Because we replace the spatial quaternion (four-dimensional hypercomplex) 2D convolution with two axial vectormap (three-dimensional hypercomplex) 1D convolutions, the number of output channels is restricted to 120, or a multiple of 120, so that it is divisible by both three and four. Our investigation concludes that, in the comparison between the baseline hypercomplex networks and our proposed axial-hypercomplex networks, the axial-hypercomplex convolution provides better validation performance with fewer trainable parameters and FLOPS for image classification tasks.

Further work may be directed toward the architectures of the axial quaternion network and the axial vectormap network. Moreover, other datasets should be tested to check whether the proposed architectures perform in a similar manner. Finally, purely axial-quaternion and axial-vectormap convolutional methods would help remove the output-channel constraint, since the channel count would then only need to be divisible by four for axial-quaternion networks or by three for axial-vectormap networks.

References

[1] Martin Arjovsky, Amar Shah, and Yoshua Bengio. Unitary evolution recurrent neural networks. In International Conference on Machine Learning, pages 1120–1128. PMLR, 2016.
[2] Pierre Buyssens, Abderrahim Elmoataz, and Olivier Lézoray. Multiscale convolutional neural networks for vision-based classification of cells. In Asian Conference on Computer Vision, pages 342–352. Springer, 2012.
[3] Ivo Danihelka, Greg Wayne, Benigno Uria, Nal Kalchbrenner, and Alex Graves. Associative long short-term memory. In International Conference on Machine Learning, pages 1986–1994. PMLR, 2016.
[4] Chase J. Gaudet and Anthony S. Maida. Deep quaternion networks. In 2018 International Joint Conference on Neural Networks (IJCNN), pages 1–8. IEEE, 2018.
[5] Chase J. Gaudet and Anthony S. Maida. Removing dimensional restrictions on complex/hyper-complex neural networks. In 2021 IEEE International Conference on Image Processing (ICIP), pages 319–323. IEEE, 2021.
[6] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778, 2016.
[7] Akira Hirose and Shotaro Yoshida. Generalization characteristics of complex-valued feedforward neural networks in relation to signal coherence. IEEE Transactions on Neural Networks and Learning Systems, 23(4):541–551, 2012.
[8] Jonathan Ho, Nal Kalchbrenner, Dirk Weissenborn, and Tim Salimans. Axial attention in multidimensional transformers. arXiv preprint arXiv:1912.12180, 2019.
[9] Shima Javanmardi, Seyed-Hassan Miraei Ashtiani, Fons J. Verbeek, and Alex Martynenko. Computer-vision classification of corn seed varieties using deep convolutional neural network. Journal of Stored Products Research, 92:101800, 2021.
[10] Alex Krizhevsky, Geoffrey Hinton, et al. Learning multiple layers of features from tiny images. 2009.
[11] Ya Le and Xuan S. Yang. Tiny ImageNet visual recognition challenge. 2015.
[12] Yuval Netzer, Tao Wang, Adam Coates, Alessandro Bissacco, Bo Wu, and Andrew Y. Ng. Reading digits in natural images with unsupervised feature learning. 2011.
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 6, 7 [13] Tohru Nitta.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' On the critical points of the complex- valued neural network.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' In Proceedings of the 9th Inter- national Conference on Neural Information Processing, 2002.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' ICONIP’02.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=', volume 3, pages 1099–1103.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' IEEE, 2002.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1 [14] Titouan Parcollet, Mohamed Morchid, and Georges Linar`es.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Quaternion convolutional neural networks for heterogeneous image processing.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' In ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pages 8514–8518.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' IEEE, 2019.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1, 2 [15] Titouan Parcollet, Mirco Ravanelli, Mohamed Morchid, Georges Linar`es, Chiheb Trabelsi, Renato De Mori, and Yoshua Bengio.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Quaternion recurrent neural networks.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' arXiv preprint arXiv:1806.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='04418, 2018.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1, 2 [16] Nazmul Shahadat and Anthony Maida.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Enhancing resnet image classification performance by using parameterized hypercomplex multiplication, Nov 2021.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1, 2, 3, 4, 5, 6, 7 [17] Nazmul Shahadat and Anthony S Maida.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Adding quater- nion representations to attention networks for classifica- tion.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' arXiv preprint arXiv:2110.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='01185, 2021.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1, 4 [18] Chiheb Trabelsi, Olexa Bilaniuk, Ying Zhang, Dmitriy Serdyuk, Sandeep Subramanian, Joao Felipe Santos, Soroush Mehri, Negar Rostamzadeh, Yoshua Bengio, and Christopher J Pal.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Deep complex networks.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' arXiv preprint arXiv:1705.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='09792, 2017.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 2, 7 [19] Huiyu Wang, Yukun Zhu, Bradley Green, Hartwig Adam, Alan Yuille, and Liang-Chieh Chen.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Axial- deeplab: Stand-alone axial-attention for panoptic seg- mentation.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' In European Conference on Computer Vision, pages 108–126.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Springer, 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 3 [20] Jiasong Wu, Ling Xu, Fuzhi Wu, Youyong Kong, Lotfi Senhadji, and Huazhong Shu.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Deep octonion networks.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Neurocomputing, 397:179–191, 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1 [21] Ruyue Xin, Jiang Zhang, and Yitong Shao.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Complex net- work classification with convolutional neural network.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Tsinghua Science and technology, 25(4):447–457, 2020.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 1 [22] Aston Zhang, Yi Tay, Shuai Zhang, Alvin Chan, Anh Tuan Luu, Siu Cheung Hui, and Jie Fu.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' Beyond fully-connected layers with quaternions: Parameteriza- tion of hypercomplex multiplications with 1/n parame- ters.' 
metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' arXiv preprint arXiv:2102.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content='08597, 2021.' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'} +page_content=' 3' metadata={'source': '/home/zjlab/wf/langchain-ChatGLM/knowledge_base/19E3T4oBgHgl3EQfngq-/content/2301.04626v1.pdf'}