| chunk_id (stringclasses, 937) | content (string, 401–2.02k) | title (string, 8–162) | source_url (string, 32) | full_citation (string, 25–179) | embedding (list) |
|---|---|---|---|---|---|
0 | arXiv:2401.09350v1 [cs.DS] 17 Jan 2024
Sebastian Bruch
# Foundations of Vector Retrieval
# Preface
We are witness to a few years of remarkable developments in Artificial Intelligence, driven by advanced machine learning algorithms and, in particular, deep learning. Gargantuan, complex ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.07268679887056351,
-0.04013438895344734,
0.0665549635887146,
-0.007558630313724279,
0.08726479858160019,
0.007134184241294861,
-0.016330817714333534,
0.04230983927845955,
-0.020066967234015465,
-0.05376772582530975,
-0.08170247077941895,
0.04416847229003906,
-0.013041144236922264,
0.069... |
1 | These neural networks and their training algorithms may be complex, and the scope of their impact broad and wide, but nonetheless they are simply functions in a high-dimensional space. A trained neural network takes a vector as input, crunches and transforms it in various ways, and produces another vector, often in som... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.05044270679354668,
-0.10408303886651993,
-0.018599553033709526,
-0.001484756707213819,
0.029128413647413254,
0.07625225931406021,
-0.012493920512497425,
-0.04724474251270294,
0.07122699916362762,
-0.043805383145809174,
-0.03631635755300522,
0.02384120225906372,
0.027283938601613045,
0.0... |
2 | If new and old knowledge can be squeezed into a collection of learnt or hand-crafted vectors, what useful things does that enable us to do? A metaphor that might help us think about that question is this: An ever-evolving database full of such vectors that capture various pieces of data can
be understood as a m... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.017140544950962067,
-0.0130759347230196,
-0.05749322846531868,
0.05339270085096359,
0.038260236382484436,
0.08604811131954193,
0.06534890830516815,
0.023251380771398544,
0.07344342768192291,
0.03495679423213005,
-0.048334866762161255,
0.07690949738025665,
-0.022055208683013916,
0.0366403... |
3 | Similarity is then a function of two vectors, quantifying how similar they are. It may, for example, be based on the Euclidean distance between the query vector and a database vector, where similar vectors have a smaller distance. Or it may instead be based on the inner product between two vectors. Or their an... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.0724412351846695,
-0.06799224019050598,
-0.07165085524320602,
-0.07258643209934235,
0.025008467957377434,
-0.0017764209769666195,
0.0006771626067347825,
0.025228800252079964,
0.07903013378381729,
-0.004211220890283585,
-0.005660379305481911,
0.062444258481264114,
0.07176495343446732,
0.... |
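The three notions of similarity named in the chunk above (Euclidean distance, inner product, and the angle between vectors) can be sketched in a few lines. This is an illustrative aside using NumPy, not part of the source text; the function names are mine.

```python
import numpy as np

# Three common ways to score a query q against a database vector u.
# Euclidean distance is a dissimilarity (smaller = more similar);
# inner product and cosine similarity are similarities (larger = more similar).
def euclidean_distance(q, u):
    return float(np.linalg.norm(q - u))

def inner_product(q, u):
    return float(np.dot(q, u))

def cosine_similarity(q, u):
    return float(np.dot(q, u) / (np.linalg.norm(q) * np.linalg.norm(u)))

q = np.array([1.0, 2.0])
u = np.array([2.0, 1.0])
```

Note that the three measures can rank the same database vectors differently, which is one reason the choice of similarity function matters for retrieval algorithms.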
4 | A neural network that is trained to perform a general task such as question-answering could conceivably augment its view of the world by "recalling" information from such a database and finding answers to new questions. This is particularly useful for generative agents such as chatbots, which would otherwise be ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.0878816470503807,
-0.08819911628961563,
-0.04814438149333,
0.08560676872730255,
0.012590643018484116,
0.05988818407058716,
0.050426993519067764,
0.01359286904335022,
-0.011677290312945843,
-0.042357731610536575,
-0.0171517301350832,
-0.01214850414544344,
0.03822977468371391,
0.000920100... |
5 | For decades now, research on vector retrieval has sought to improve the efficiency of search over large vector databases. The resulting literature is rich with solutions ranging from heavily theoretical results to performant empirical heuristics. Many of the proposed algorithms have undergone rigorous benchmarking an... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.0011435792548581958,
-0.04224681854248047,
-0.02044663019478321,
0.01360204815864563,
0.0881994217634201,
-0.005925384815782309,
-0.019375862553715706,
0.01890428364276886,
-0.00009667828271631151,
-0.04198917746543884,
-0.06383731961250305,
0.05079732835292816,
0.028270173817873,
0.0166... |
6 | That gap is what this monograph intends to close. With the goal of presenting the fundamentals of vector retrieval as a sub-discipline, this manuscript delves into important data structures and algorithms that have emerged in the literature to solve the vector retrieval problem efficiently and effectively.
... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.0767616257071495,
0.023192699998617172,
-0.009158319793641567,
-0.026385288685560226,
0.08423947542905807,
0.026991089805960655,
0.028291137889027596,
-0.03700951859354973,
0.033017951995134354,
-0.04483771324157715,
-0.01922985166311264,
0.0635003075003624,
0.0236705020070076,
0.008152... |
7 | # Retrieval Algorithms
With that foundation in place and the question clearly formulated, the second part of the monograph explores the different classes of existing solutions in great depth. We close each chapter with a summary of algorithmic insights. There, we will also discuss what remains challenging and explore f... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.01934048905968666,
0.007810971699655056,
0.02654971368610859,
-0.035635314881801605,
0.023992644622921944,
0.013475391082465649,
-0.01901537925004959,
0.017338961362838745,
0.029975270852446556,
0.003449075622484088,
-0.05588671565055847,
0.051795151084661484,
0.051208604127168655,
0.04... |
8 | Alternatively, instead of laying a mesh over the space, we may define a fixed number of buckets and map data points to these buckets with the property that, if two data points are close to each other according to the distance function, they are more likely to be mapped to the same bucket. When processing a query, we ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.005861202254891396,
0.03189714998006821,
-0.011032724753022194,
-0.03499558940529823,
0.03912322595715523,
-0.02096850611269474,
0.007999604567885399,
-0.003267928259447217,
0.029994944110512733,
0.04127097874879837,
0.011147712357342243,
0.0048712994903326035,
0.0499742366373539,
0.005... |
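The bucketing idea in the chunk above is the essence of locality-sensitive hashing. A minimal sketch under the angular-distance measure, assuming NumPy (the hyperplane count, dimensionality, and seed are arbitrary illustrative choices, not from the source):

```python
import numpy as np

# Random-hyperplane LSH (SimHash-style): each random hyperplane contributes
# one sign bit, and the tuple of bits names a bucket. Vectors separated by a
# small angle agree on most hyperplanes, so they are likely to land in the
# same bucket; at query time we then only scan the query's bucket.
rng = np.random.default_rng(0)
d, n_bits = 8, 6
planes = rng.standard_normal((n_bits, d))

def bucket(v):
    return tuple(int(b) for b in (planes @ v > 0))

v = rng.standard_normal(d)
```

Because the bits depend only on the sign of each projection, any positive scaling of a vector maps to the same bucket, while its negation maps to the complementary one.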
9 | As we examine vector retrieval algorithms, it is inevitable that we must ink in extra pages to discuss why similarity based on inner product is special and why it poses extra challenges for the algorithms in each category; many of these difficulties will become clear in the introductory chapters.
There is, however, a ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.08407152444124222,
0.019758179783821106,
-0.019530054181814194,
-0.06837217509746552,
0.03934955969452858,
0.01738685369491577,
0.016391240060329437,
-0.04723554104566574,
0.08077958226203918,
-0.0476364865899086,
0.012794685550034046,
0.08186410367488861,
0.020679615437984467,
0.035887... |
10 | Related to the topic of compression is the concept of sketching. Sketching is a technique to project a high-dimensional vector into a low-dimensional vector, called a sketch, such that certain properties (e.g., the L2 norm, or inner products between any two vectors) are approximately preserved. This probabilistic metho... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.09454315900802612,
-0.001676070736721158,
0.009700935333967209,
-0.08383624255657196,
0.03878707438707352,
0.018878668546676636,
-0.0010091569274663925,
0.008501402102410793,
0.00027733080787584186,
-0.008083927445113659,
-0.033808883279561996,
0.05681822821497917,
0.01712546870112419,
... |
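As a hypothetical illustration of the sketching idea described above (dimensions and seed chosen arbitrarily), a Johnson–Lindenstrauss-style linear sketch is simply multiplication by a scaled random matrix:

```python
import numpy as np

# Project d-dimensional vectors down to k dimensions with a random Gaussian
# matrix scaled by 1/sqrt(k). The L2 norm of a vector, and inner products
# between vectors, are preserved in expectation and concentrate around their
# true values as k grows -- an approximate, probabilistic guarantee.
rng = np.random.default_rng(42)
d, k = 1024, 128
S = rng.standard_normal((k, d)) / np.sqrt(k)

u = rng.standard_normal(d)
sketch = S @ u  # the k-dimensional sketch of u
```

The same matrix S must be applied to every vector (and to queries) so that the preserved geometry is consistent across the collection.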
11 | # Intended Audience
This monograph is intended as an introductory text for graduate students who wish to embark on research on vector retrieval. It is also meant to serve as a self-contained reference that captures important developments in the field, and as such, may be useful to established researchers as well.
As th... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.07027292996644974,
0.023448888212442398,
-0.010932100005447865,
-0.06164257973432541,
0.05671737715601921,
-0.02129141427576542,
0.0560152567923069,
-0.028493698686361313,
0.013797493651509285,
0.004478210583329201,
-0.022759635001420975,
0.048345014452934265,
0.1001489982008934,
-0.016... |
12 | # Acknowledgements
I am forever indebted to my dearest colleagues Edo Liberty, Amir Ingber, Brian Hentschel, and Aditya Krishnan. This incredible but humble group of scholars at Pinecone are generous with their time and knowledge, patiently teaching me what I do not know, and letting me use them as a sounding board wit... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.11067993938922882,
0.06806208193302155,
0.018555378541350365,
-0.00630111712962389,
-0.046445440500974655,
0.05518219247460365,
0.016531910747289658,
0.05692151561379433,
0.034301988780498505,
-0.013390007428824902,
-0.02656870149075985,
0.0074618482030928135,
0.037464823573827744,
0.03... |
13 | # Notation
This section summarizes the special symbols and notation used throughout this work. We often repeat these definitions in context as a reminder, especially if we choose to abuse notation for brevity or other reasons.
Paragraphs that are highlighted in a gray box such as this contain important statements,... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.03657155483961105,
0.018741851672530174,
0.009188681840896606,
-0.018874796107411385,
0.02675699070096016,
0.038664061576128006,
0.12752722203731537,
0.05977669358253479,
0.11656473577022552,
-0.019476691260933876,
0.06543731689453125,
0.04399726912379265,
0.09867720305919647,
-0.066750... |
14 | # Sets
J Calligraphic font typically denotes sets. |·| The cardinality (number of items) of a finite set. [n] The set {1, 2, 3, . . . , n}. B(u, r) The closed ball of radius r centered at point u: {v | δ(u, v) ≤ r}, where δ(·, ·) is the distance function. \ The set difference operator: A \ B = {x ∈ A | x ∉... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.010328972712159157,
0.008733655326068401,
-0.04645359888672829,
-0.07388410717248917,
-0.05126047134399414,
0.030255787074565887,
0.12231573462486267,
0.003206134308129549,
0.02510855719447136,
-0.03241150081157684,
-0.01605810970067978,
-0.009437276050448418,
0.0840800479054451,
0.0055... |
15 | Z The set of integers.
R^d d-dimensional Euclidean space.
S^(d-1) The hypersphere in R^d.
u, v, w Lowercase letters denote vectors.
u_i, v_i, w_i Subscripts identify a specific coordinate of a vector, so that u_i is the i-th coordinate of vector u.
# Functions and Operators
nz (·)
The set of non-zero coordinates of a vector: ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.058962006121873856,
-0.04009953513741493,
-0.006584824528545141,
-0.004400601610541344,
-0.026849055662751198,
-0.012930944561958313,
0.1289820522069931,
0.0060804751701653,
0.07629858702421188,
-0.037365954369306564,
0.04142698645591736,
0.018066292628645897,
0.07782228291034698,
0.023... |
17 | Intrinsic Dimensionality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23 3.1 High-Dimensional Data and Low-Dimensional Manifolds . . . . . 23 3.2 Doubling Measure and Expansion Rate . . . . . . . . . . . . . . . . . . . . 24 3.3 Doubling Dimension . . . . . . . . . . . . . . . . . . . . . . . . .... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.039242975413799286,
-0.0764140859246254,
-0.0025679050013422966,
-0.01787695847451687,
-0.07028414309024811,
-0.034435637295246124,
-0.03883352875709534,
0.03646828234195709,
0.00041376601438969374,
-0.023279301822185516,
0.03740517422556877,
0.000541874032933265,
0.01599751226603985,
0... |
18 | 1 Vector Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 1.1 Vector Representations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3 1.2 Vectors as Units of Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5 . . . . . . . . . . . . ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.04582015424966812,
-0.0798853188753128,
-0.10993073880672455,
-0.11513903737068176,
-0.02676321007311344,
0.029108017683029175,
-0.05058930441737175,
0.07423441112041473,
-0.017106391489505768,
-0.051897693425416946,
-0.008314316160976887,
0.035506632179021835,
0.09054422378540039,
0.05... |
20 | 2 Retrieval Stability in High Dimensions . . . . . . . . . . . . . . . . . . . 17 2.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17 2.2 Formal Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18 2.3 Empirical Demonstrati... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.07251998782157898,
-0.030429109930992126,
-0.0639546811580658,
-0.04493219777941704,
0.004442824516445398,
-0.04709490388631821,
-0.02928094193339348,
0.046854499727487564,
-0.00038980128010734916,
-0.08071678876876831,
0.019428418949246407,
0.07824040949344635,
0.08929293602705002,
-0.... |
21 | Locality Sensitive Hashing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57 5.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57 5.2 Top-k Retrieval with LSH . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59 5.2.1 The Point Locatio... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.06896638125181198,
0.04813579469919205,
-0.06279457360506058,
-0.0837409496307373,
-0.011560834012925625,
-0.03605911135673523,
-0.011005761101841927,
0.009821712970733643,
0.0011355597525835037,
0.0407312735915184,
-0.017065299674868584,
-0.017824551090598106,
0.10143791884183884,
0.002... |
22 | . . . . . . . . . . . . . . . . . 63 5.3.2 Angular Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63 5.3.3 Euclidean Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69 5.3.4 Inner Product . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70 5.4 Cl... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.03123854659497738,
-0.027624079957604408,
-0.10023617744445801,
-0.10831616818904877,
-0.07006774097681046,
-0.00830104760825634,
-0.041605930775403976,
0.07296393066644669,
-0.0010040472261607647,
-0.03119245544075966,
0.06727925688028336,
0.03119531460106373,
0.02200525626540184,
0.01... |
23 | 4 Branch-and-Bound Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 4.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31 4.2 k-dimensional Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33 4.2.1 Complexity Analys... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.08821070194244385,
0.014553843066096306,
0.002040520077571273,
-0.005498294252902269,
0.016344746574759483,
-0.14167077839374542,
-0.03320794925093651,
0.006005150731652975,
-0.024929877370595932,
0.08458022773265839,
-0.09220726788043976,
-0.012007011100649834,
0.06462384015321732,
-0.0... |
24 | 4.3.1 Randomized Partition Trees . . . . . . . . . . . . . . . . . . . . . . . 38 4.3.2 Spill Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45 4.4 Cover Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47 4.4.1 The Abstract Cover Tree an... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.0009664146928116679,
0.08614598959684372,
0.06939008086919785,
0.021005939692258835,
0.06650523841381073,
-0.05861086770892143,
0.023243147879838943,
0.017504913732409477,
-0.0331951305270195,
0.03503395989537239,
-0.051505688577890396,
-0.08958545327186584,
0.07535292208194733,
0.00599... |
27 | 6 Graph Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 6.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73 6.1.1 The Research Question . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75 6.2 The Delaunay Graph... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.015808027237653732,
0.04761739820241928,
-0.07077939808368683,
-0.07902055233716965,
-0.05199287459254265,
-0.08487291634082794,
-0.0917549878358841,
-0.0011948334285989404,
0.008131848648190498,
0.034282438457012177,
-0.0269539263099432,
-0.01749664731323719,
0.08138677477836609,
-0.002... |
28 | . . . . . . . . 78 6.2.3 Top-1 Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78 6.2.4 Top-k Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81 6.2.5 The k-NN Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82 6.2.6 The Case of Inner P... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.02910122089087963,
-0.015973806381225586,
-0.05897285416722298,
-0.05593785643577576,
-0.01726236566901207,
0.01666741818189621,
0.04013340175151825,
0.023860372602939606,
0.001027747057378292,
0.009537743404507637,
0.0035883665550500154,
0.017198141664266586,
0.06561904400587082,
-0.00... |
30 | 6.3.2 Extension to the Delaunay Graph . . . . . . . . . . . . . . . . . . 91 6.3.3 Approximation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94 6.4 Neighborhood Graphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94 6.4.1 From SNG to α-SNG . . . . . . . . . . . . . . ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.012392439879477024,
-0.003384470706805587,
-0.04695972055196762,
-0.06357131898403168,
-0.031985845416784286,
-0.045236553996801376,
-0.059387657791376114,
0.05904778093099594,
-0.04448018968105316,
0.031471699476242065,
0.03890925645828247,
-0.0407014824450016,
0.11832541972398758,
0.0... |
31 | . 101 Sampling Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111 8.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111 8.2 Approximating the Ranks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112 8.2.1 Non-ne... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.00869055837392807,
0.03388203680515289,
-0.05720837041735649,
-0.05070389807224274,
-0.010803029872477055,
-0.06104155629873276,
-0.026370471343398094,
0.0792771652340889,
-0.04435807839035988,
0.020958008244633675,
-0.06380491703748703,
-0.016003379598259926,
0.08345901221036911,
-0.00... |
32 | Approximating the Scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117 8.3.1 The BoundedME Algorithm . . . . . . . . . . . . . . . . . . . . . . . 118 8.3.2 Proof of Correctness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120 8.4 Closing Remarks . . . . . . . . . . . . . . . . . . . ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.05124589428305626,
0.06697719544172287,
-0.08376659452915192,
-0.10421567410230637,
-0.07050614804029465,
-0.02387731522321701,
-0.03242901712656021,
0.029532182961702347,
-0.133991077542305,
-0.032172080129384995,
-0.05944495275616646,
0.026204604655504227,
0.1482958197593689,
-0.01140... |
34 | 9 Quantization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127 9.1 Vector Quantization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127 9.1.1 Codebooks and Codewords . . . . . . . . . . . . . . . . . . . . . . . . 128 9.2 Product Quantization . ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.011041062884032726,
-0.016290083527565002,
-0.11292974650859833,
-0.1514391303062439,
-0.04431583732366562,
-0.0015442126896232367,
-0.014214362017810345,
0.06957582384347916,
-0.09193478524684906,
-0.060561493039131165,
0.02691020257771015,
-0.01545876543968916,
0.05571267753839493,
0.... |
35 | . . . . . . . . . . . . . . . . . . . . . 131 9.3 Additive Quantization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132 9.3.1 Distance Computation with AQ . . . . . . . . . . . . . . . . . . . . 133 9.3.2 AQ Encoding and Codebook Learning . . . . . . . . . . . . . . 133 9.4 Quantization for Inne... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.04854113608598709,
-0.03624983876943588,
-0.1291932761669159,
-0.07835515588521957,
-0.07984954118728638,
0.04170890897512436,
0.006243867799639702,
0.04704274237155914,
-0.02544035203754902,
-0.04802446812391281,
0.016430767253041267,
-0.050608813762664795,
0.09789019078016281,
-0.0002... |
36 | Quantization 9.1 Vector Quantization 9.1.1 Codebooks and Codewords 9.2 Product Quantization 9.2.1 Distance Computation with PQ 9.2.2 Optimized Product Quantization 9.2.3 Extens... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.05493392050266266,
-0.022897779941558838,
-0.08199169486761093,
-0.11716713756322861,
-0.023224037140607834,
0.0307326540350914,
0.014280068688094616,
0.03544183447957039,
-0.027977708727121353,
-0.026807324960827827,
0.05584817752242088,
-0.040497101843357086,
0.010242987424135208,
0.0... |
37 | 10 Sketching . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143 10.1 Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 143 10.2 Linear Sketching with the JL Transform . . . . . . . . . . . . . . . . . . . 145 10.2.1 T... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.045712877064943314,
0.04285568743944168,
-0.0756872221827507,
-0.0816861093044281,
-0.06779514253139496,
-0.04936997964978218,
-0.05010710284113884,
0.0597977377474308,
-0.020757563412189484,
-0.03557591140270233,
0.001826039981096983,
-0.004120539873838425,
0.06508856266736984,
0.06809... |
38 | 10.3.2 Inner Product Approximation . . . . . . . . . . . . . . . . . . . . . . 151 10.3.3 Theoretical Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 152 10.3.4 Fixing the Sketch Size . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159 10.4 Sketching by Sampling . . . . . . . . . . . . . . ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.0036583358887583017,
0.018925994634628296,
-0.06778369843959808,
-0.15693029761314392,
-0.0752294734120369,
-0.03761908411979675,
-0.019522041082382202,
0.10276205837726593,
-0.039140500128269196,
-0.004606390371918678,
-0.03180231899023056,
-0.0041512721218168736,
0.07769577950239182,
0... |
41 | A Collections References B Probability Review B.1 Probability B.2 Random Variables B.3 Conditional Probability B.4 Independence B.5 Expectat... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.0006443895981647074,
0.0022720603737980127,
-0.025516845285892487,
-0.10841649025678635,
-0.03403378650546074,
0.0426030158996582,
0.040829140692949295,
0.08800981193780899,
0.041082851588726044,
0.06907355040311813,
-0.01968643069267273,
-0.05372276529669762,
0.06751307845115662,
-0.05... |
43 | Part I Introduction
Chapter 1 Vector Retrieval
Abstract This chapter sets the stage for the remainder of this monograph. It explains where vectors come from, how they have come to represent data of any modality, and why they are a useful mathematical tool in machine learning. It then describes the structure we typicall... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.02329077757894993,
-0.041735127568244934,
0.02820531278848648,
-0.057680655270814896,
0.055978529155254364,
0.03286540508270264,
0.020317165181040764,
-0.027328429743647575,
0.044655315577983856,
-0.04398483783006668,
-0.07009658217430115,
0.043389298021793365,
0.06204709783196449,
0.05... |
44 | Vector representations of objects have long been an integral part of the machine learning literature. Indeed, a classifier, a regression model, or a ranking function learns patterns from, and acts on, vector representations of data. In the past, this vector representation of an object was nothing more than a collecti... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.013268604874610901,
0.011322765611112118,
0.005889520980417728,
0.0256276223808527,
0.07401195913553238,
0.04050099104642868,
-0.03339654952287674,
-0.023147623986005783,
0.09886211901903152,
0.030140578746795654,
-0.09769490361213684,
0.07622765749692917,
0.03693534433841705,
0.0482780... |
45 | Fig. 1.1: Vector representation of a piece of text by adopting a "bag of words" view: A text document, when stripped of grammar and word order, can be thought of as a vector, where each coordinate represents a term in our vocabulary and its value records the frequency of that term in the document or some function... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
0.0035187196917831898,
-0.009307773783802986,
-0.048557739704847336,
-0.011027722619473934,
-0.06569674611091614,
0.0496780127286911,
0.004848519340157509,
-0.009044073522090912,
0.14586852490901947,
-0.002963896607980132,
0.05815724655985832,
0.010558153502643108,
0.09109143912792206,
0.0... |
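The "bag of words" construction described in the figure caption above can be made concrete in a few lines; the toy vocabulary here is made up for illustration and is not from the source.

```python
from collections import Counter

# Bag-of-words: fix a vocabulary, discard grammar and word order, and record
# each term's frequency at that term's coordinate of the vector.
vocab = ["vector", "retrieval", "query", "index"]

def bag_of_words(text):
    counts = Counter(text.lower().split())
    return [counts[term] for term in vocab]

# bag_of_words("vector retrieval maps a query to a vector") -> [2, 1, 1, 0]
```

In practice the raw frequency is often replaced by "some function of it," as the caption notes, e.g. a TF-IDF weight.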
46 | The advent of deep learning and, in particular, Transformer-based models [Vaswani et al., 2017] brought about vector representations that are beyond the elementary formation above. The resulting representation is often, as a single entity, referred to as an embedding, instead of a "feature vector," though the u... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.044363752007484436,
-0.06110183149576187,
0.0032211795914918184,
0.01573549583554268,
0.04165997356176376,
0.053757958114147186,
-0.024514900520443916,
0.0024742886889725924,
0.13454501330852509,
-0.0029807311948388815,
0.005619807634502649,
0.04557885602116585,
0.0588236078619957,
0.05... |
47 | 1.2 Vectors as Units of Retrieval
by many recent models of text [Bai et al., 2020, Formal et al., 2021, 2022, Zhuang and Zuccon, 2022, Dai and Callan, 2020, Gao et al., 2021, Mallia et al., 2021, Zamani et al., 2018, Lin and Ma, 2021] and has been shown to produce effective representations.
Vector representations of te... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.05869710072875023,
-0.042472366243600845,
-0.017965348437428474,
0.03254537284374237,
0.05081940442323685,
0.0732152983546257,
0.009582625702023506,
0.03841031715273857,
0.09504764527082443,
0.00029074482154101133,
-0.0018853602232411504,
0.12092598527669907,
0.11409735679626465,
0.0727... |
48 | Unsurprisingly, the same embedding paradigm can be extended to other data modalities beyond text: Using deep learning models, one may embed images, videos, and audio recordings into vectors. In fact, it is even possible to project different data modalities (e.g., images and text) together into the same vector space and... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.020744111388921738,
-0.10193139314651489,
0.005899934563785791,
0.004067370668053627,
0.08595892786979675,
0.05270780622959137,
-0.03660174831748009,
-0.008739348500967026,
0.115626260638237,
-0.03433094918727875,
-0.03502591326832771,
0.050545431673526764,
0.04651661589741707,
0.075356... |
49 | That is the structure we desire: Similarity in the vector space must imply similarity between objects. So, as we engineer features to be extracted from an object, or design a protocol to learn a model to produce embeddings of data, we must choose the dimensionality d of the target space (a subset of R^d)
... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.04209327697753906,
-0.07068207859992981,
0.015064063481986523,
-0.0534755140542984,
0.010702250525355339,
0.022837746888399124,
0.011987002566456795,
0.000045617471187142655,
0.11260785162448883,
-0.04802033305168152,
0.047391947358846664,
0.011620764620602131,
0.0900319293141365,
0.064... |
50 | We should be able to make similar arguments given a semantic embedding of text documents. Again consider the sparse embeddings with d being the size of the vocabulary, and more concretely, take SPLADE [Formal et al., 2021] as a concrete example. This model produces real-valued sparse vectors in an inner product space. ... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
-0.08498440682888031,
-0.07722213119268417,
-0.08151712268590927,
-0.04853641614317894,
0.0033590581733733416,
-0.002886410104110837,
-0.05621619522571564,
-0.029056955128908157,
0.07878508418798447,
-0.007868598215281963,
-0.002289674710482359,
0.05571691319346428,
0.03979925438761711,
0.... |
51 | We are often interested in finding k objects that have the highest degree of similarity to a query object. When those objects are represented by vectors in a collection X , where the distance function δ(·, ·) is reflective of similarity, we may formalize this top-k question mathematically as finding the k minimizers... | Foundations of Vector Retrieval | https://arxiv.org/abs/2401.09350 | Foundations of Vector Retrieval arXiv:2401.09350 | [
(k)argmin_{u ∈ X} δ(q, u).    (1.1)
A web search engine, for example, finds the most relevant documents to your query by first formulating it as a top-k retrieval problem over a collection of (not necessarily text-based) vectors. In this way, it quickly finds the subset of documents from the entire web that may satisfy t...
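As a minimal sketch of Equation (1.1), a brute-force top-k search over a toy collection can be written as follows (the collection, query, and function name are made up for illustration):

```python
import numpy as np

def top_k(collection: np.ndarray, query: np.ndarray, k: int) -> np.ndarray:
    """Return the indices of the k vectors closest to `query` under L2 distance."""
    distances = np.linalg.norm(collection - query, axis=1)
    # argpartition places the k smallest distances first in O(n) time;
    # we then sort only those k candidates.
    candidates = np.argpartition(distances, k - 1)[:k]
    return candidates[np.argsort(distances[candidates])]

X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [0.1, 0.0]])
q = np.array([0.0, 0.1])
print(top_k(X, q, 2))  # the two nearest vectors: indices 0 and 3
```

This exhaustive linear scan is the exact baseline that the faster, approximate methods discussed later try to beat.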
# 1.3.1 Nearest Neighbor Search
In many cases, the distance function is derived from a proper metric, where non-negativity, coincidence, symmetry, and the triangle inequality hold for δ. A clear example of this is the L2 distance: δ(u, v) = ∥u − v∥₂. The resulting problem, illustrated for a toy example in Figure 1.2...
Fig. 1.2: Variants of vector retrieval for a toy vector collection in R^2. In Nearest Neighbor search, we find the data point whose L2 distance to the query point is minimal (v for top-1 search). In Maximum Cosine Similarity search, we instead find the point whose angular distance to the query point is minimal (v and p ...
(k)argmin_{u ∈ X} 1 − ⟨q, u⟩ / (∥q∥₂ ∥u∥₂) = (k)argmax_{u ∈ X} ⟨q, u⟩ / ∥u∥₂.    (1.3)
The latter is referred to as the k-Maximum Cosine Similarity (k-MCS) problem. Note that, because the norm of the query point, ∥q∥₂, is a constant in the optimization problem, it can simply be discarded; the resulti...
(k)argmax_{u ∈ X} ⟨q, u⟩.    (1.4)
This is easy to see for k-MCS: If, in a pre-processing step, we L2-normalize all vectors in X so that u is transformed to u′ = u/∥u∥₂, then ∥u′∥₂ = 1 and therefore Equation (1.3) reduces to Equation (1.4).
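The reduction is mechanical enough to verify numerically. In this sketch (random data; all names are invented for illustration), solving k-MCS directly and solving plain MIPS over the L2-normalized collection pick the same vector:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))   # a made-up collection
q = rng.normal(size=8)          # a made-up query

# k-MCS (k = 1) solved directly via cosine similarity.
cosine = (X @ q) / (np.linalg.norm(X, axis=1) * np.linalg.norm(q))
mcs_answer = int(np.argmax(cosine))

# Reduction: L2-normalize the collection once, then solve plain MIPS.
X_unit = X / np.linalg.norm(X, axis=1, keepdims=True)
mips_answer = int(np.argmax(X_unit @ q))

assert mcs_answer == mips_answer  # the two problems have the same solution
```

The normalization happens once, offline; every subsequent query can then be answered with an inner-product search.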
As for a reduction of k-NN to k-MIPS, we can expand Equation (1.2...
The k-MIPS problem, illustrated on a toy collection in Figure 1.2(c), does not come about just as the result of the reductions shown above. In fact, there exist embedding models (such as SPLADE, as discussed earlier) that learn vector representations with respect to inner product as the distance function. In other word...
As an example, suppose v and p = αv for some α > 1 are vectors in the collection X, a case demonstrated in Figure 1.2(c). Clearly, we have that ⟨v, p⟩ = α⟨v, v⟩ > ⟨v, v⟩, so that p (and not v) is the solution to MIPS for the query point v.
In high-enough dimensions and under ce...
lim_{d→∞} P[u = argmax_{v ∈ X} ⟨u, v⟩] = 1.
Proof. Denote by Var[·] and E[·] the variance and expected value operators. By the conditions of the theorem, it is clear that E[⟨u, u⟩] = d E[Z²], where Z is the random variable that generates each coordinate of the vector. We can also see that E[⟨u, X⟩] = 0 for a random...
Let us turn to the last term and bound the probability for a random data point:
P[⟨u, X⟩ > ⟨u, u⟩] = P[Y > d E[Z²]],  where Y = ⟨u, X⟩ − ⟨u, u⟩ + d E[Z²].

The expected value of Y is 0. Denote by σ² its variance. By the application of the one-sided Chebyshev's inequality, we arrive at the following bound:

P[⟨u, X⟩ > ⟨u, u⟩] ≤ σ² / (σ² + d² E[Z²]²) ...
[Figure 1.3 scatter: real collections (FEVER-TasB, NQ-ESplade, NQ-TasB, GloVe-200, Quora-Splade, GloVe-100, Quora-MiniLM, Quora-ESplade, FEVER-MiniLM, GloVe-50, NQ-MiniLM, GloVe-25) plotted against dimensionality (d).]

(a) Synthetic (b) Real

Fig. 1.3: Probability that u ∈ X is the solution to MIPS over X with query u versus the dimension...
# 1.3.3.2 Empirical Demonstration of the Lack of Coincidence
Let us demonstrate the effect of Theorem 1.1 empirically. First, let us choose distributions that meet the requirements of the theorem: a Gaussian distribution with mean 0 and variance 1, and a uniform distribution over [−√12/2, √12/2] (with variance 1) will do....
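A quick Monte Carlo check of this setup might look like the following sketch (Gaussian coordinates only; the collection size and trial count are arbitrary choices, and the function name is invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def self_mips_rate(d: int, n: int = 100, trials: int = 200) -> float:
    """Fraction of trials in which a data point u is its own MIPS solution
    when the query is u itself and coordinates are iid N(0, 1)."""
    hits = 0
    for _ in range(trials):
        X = rng.normal(size=(n, d))
        u = X[0]
        hits += int(np.argmax(X @ u) == 0)
    return hits / trials

for d in (2, 32, 512):
    print(d, self_mips_rate(d))
```

In line with Theorem 1.1, the rate climbs toward 1 as d grows: in low dimensions another point frequently beats u, while in high dimensions u is almost always its own MIPS answer.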
Fig. 1.4: Approximate variants of top-1 retrieval for a toy collection in R^2. In NN, we admit vectors whose distance is within a (1 + ϵ) factor of the optimal solution's. As such, x and y are both valid solutions as they are in a ball with radius (1 + ϵ)δ(q, x) centered at the query. Similarly, in MCS, we accept a vector (e.g., x) if...
of that ratio to 1. So it appears that the requirements of Theorem 1.1 are more forgiving than one may imagine.
We also repeat the exercise above on several real-world collections, a description of which can be found in Appendix A along with salient statistics. The results of these experiments are visualized in Figur...
# 1.4 Approximate Vector Retrieval
The first case of solving the problem exactly but inefficiently is uninteresting: If we are looking to find the solution for k = 1, for example, it is enough to compute the distance function for every vector in the collection and the query, resulting in linear complexity. When k > 1, ...
Figure 1.4 renders the solution space for an example collection in R^2.
The formalism above extends to the more general case where k > 1 in an obvious way: a vector u is a valid solution to the ϵ-approximate top-k problem if its distance to the query point is at most (1 + ϵ) times the distance to the k-th optimal ve...
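The acceptance criterion itself fits in a couple of lines of code (a sketch; the function name and numbers are invented):

```python
def is_eps_approximate(dist_candidate: float, dist_optimal: float, eps: float) -> bool:
    """A candidate is acceptable if its distance to the query is at most
    (1 + eps) times the distance of the optimal vector."""
    return dist_candidate <= (1.0 + eps) * dist_optimal

# With eps = 0.1, a candidate at distance 1.05 is acceptable when the
# optimal vector sits at distance 1.0, but one at distance 1.2 is not.
print(is_eps_approximate(1.05, 1.0, 0.1), is_eps_approximate(1.2, 1.0, 0.1))
```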
such that for all u ∈ S, Equation (1.5) is satisfied, where u* is the k-th optimal vector obtained by solving the problem in Definition 1.1.
Despite the extension to top-k above, it is more common to characterize the effectiveness of an approximate top-k solution as the percentage of correct vectors that are present...
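That percentage is the familiar recall measure; a minimal set-based sketch (names and example sets invented):

```python
def recall(retrieved: set, exact_top_k: set) -> float:
    """Fraction of the exact top-k set that the approximate solution recovered."""
    return len(retrieved & exact_top_k) / len(exact_top_k)

# The approximate algorithm returned {3, 7, 21, 42}; the exact top-4 is {3, 7, 9, 42}.
print(recall({3, 7, 21, 42}, {3, 7, 9, 42}))  # 0.75
```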
S. Bruch, S. Gai, and A. Ingber. An analysis of fusion functions for hybrid retrieval. ACM Transactions on Information Systems, 42(1), August 2023.

T. Chen, M. Zhang, J. Lu, M. Bendersky, and M. Najork. Out-of-domain semantics to the rescue! Zero-shot hybrid retrieval models. In Advances in Information Retrieval: 44th Europ...
L. Gao, Z. Dai, and J. Callan. COIL: Revisit exact lexical match in information retrieval with contextualized inverted list. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021, pag...
S. Kuzi, M. Zhang, C. Li, M. Bendersky, and M. Najork. Leveraging semantic and lexical matching to improve the recall of document retrieval systems: A hybrid approach, 2020.

J. Lin and X. Ma. A few brief notes on DeepImpact, COIL, and a conceptual framework for information retrieval techniques, 2021.

J. Lin, R. Nogueir...
N. Reimers and I. Gurevych. Sentence-BERT: Sentence embeddings using Siamese BERT-networks. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing. Association for Computational Linguistics, November 2019.

G. Salton and C. Buckley. Term-weighting approaches in automatic text retrieval. Info...
S. Wang, S. Zhuang, and G. Zuccon. BERT-based dense retrievers require interpolation with BM25 for effective passage retrieval. In Proceedings of the 2021 ACM SIGIR International Conference on Theory of Information Retrieval, pages 317-324, 2021.

X. Wu, R. Guo, D. Simcha, D. Dopson, and S. Kumar. Efficient inner produ...
S. Zhuang and G. Zuccon. Fast passage re-ranking with contextualized exact term matching and efficient passage expansion. In Workshop on Reaching Efficiency in Neural Information Retrieval, the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2022.
# Chapter 2 Retrieval Stability in High Dimensions
# 2.1 Intuition
Consider the case of proper distance functions where δ(·, ·) is a metric. Recall from Equation (1.5) that a vector u is an acceptable ϵ-approximate solution if its distance to the query q according to δ(·, ·) is at most (1 + ϵ)δ(q, u*), where u* is the optimal vector and ϵ is an arbitrary ...
76 | 17
18
2 Retrieval Stability in High Dimensions
points anyway, thereby reducing to a procedure that performs more poorly than exhaustive search.
That sounds troubling. But when might we experience that phenomenon? That is the question Beyer et al. [1999] investigate in their seminal paper.
It turns out, one scenario whe...
# 2.2 Formal Results
More generally, vector retrieval becomes unstable in high dimensions when the variance of the distance between query and data points grows substantially more slowly than its expected value. That makes sense. Intuitively, that means that more and more data points fall into the (1 + ϵ)-enlarged ba...
Proof. Let δ• = max_{u ∈ X} δ(q, u) and δ* = min_{u ∈ X} δ(q, u). If we could show that, for some d-dependent positive α and β such that β/α = 1 + ϵ, lim_{d→∞} P[α ≤ δ* ≤ δ• ≤ β] = 1, then we are done. That is because, in that case, δ•/δ* ≤ β/α = 1 + ϵ almost surely and the claim follows.
From the above, all that we need to...
lim_{d→∞} P[α ≤ δ* ≤ δ• ≤ β]
  = lim_{d→∞} P[δ(q, u) ∈ [α, β] ∀ u ∈ X]
  = lim_{d→∞} P[(1 − η) E[δ(q, X)] ≤ δ(q, u) ≤ (1 + η) E[δ(q, X)] ∀ u ∈ X]
  = lim_{d→∞} P[|δ(q, u) − E[δ(q, X)]| ≤ η E[δ(q, X)] ∀ u ∈ X].
It is now easier to work with the complementary event:
1 − lim_{d→∞} P[∃ u ∈ X s.t. |δ(q, u) − E[δ(q, X)]| > η E[δ(q, X)]].
Using the Union Bound,...
Note that q is independent of the data points and that the data points are iid random variables. Therefore, the δ(q, u)'s are iid random variables as well. Furthermore, by assumption E[δ(q, X)] exists, making it possible to apply Markov's inequality to obtain the following bound:
lim_{d→∞} P[α ≤ δ* ≤ δ• ≤ β] ≥ 1 − lim_{d→∞} ...
We mentioned earlier that if data and query points are independent of each other and vectors are drawn iid in each dimension, then vector retrieval becomes unstable. For NN with the L2 norm, it is easy to show that such a configuration satisfies the conditions of Theorem 2.1, hence the instability. Consider the fo...
When δ(q, u) = −⟨q, u⟩, the same conditions result in retrieval instability:

lim_{d→∞} Var[⟨q, u⟩] / E[⟨q, u⟩]²
  = lim_{d→∞} Var[Σ_i q_i u_i] / E[Σ_i q_i u_i]²
  = lim_{d→∞} (Σ_i Var[q_i u_i]) / (Σ_i E[q_i u_i])²   (by independence)
  = lim_{d→∞} d σ² / (d µ)² = 0,

where we write σ² = Var[q_i u_i] and µ = E[q_i u_i].
# 2.3 Empirical Demonstration of Instability
Let us examine the th...
[Figure 2.1 plots: (a) the ratio δ•/δ* versus dimensionality (d) for N(0, 1), Exp(1), and U(0, √2) coordinate distributions; (b) percent of data points that are valid approximate solutions versus ϵ (as a percent of δ*) for d ∈ {64, 128, 256}.]

(a) δ•/δ* (b) Percent Approximate Solutions

Fig. 2.1: Simulation results for Theorem 2.1 applied ...
that, as d increases, we can find a smaller ϵ such that nearly all data points fall within (1 + ϵ)δ* distance from the query. The results of our experiments confirm this phenomenon; we have plotted the results for the Gaussian distribution in Figure 2.1(b).
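The contraction of the farthest-to-nearest distance ratio is easy to reproduce in a few lines (a sketch with Gaussian coordinates; collection size and dimensions are arbitrary choices, and the function name is invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def distance_ratio(d: int, n: int = 1000) -> float:
    """Ratio of the farthest to the nearest L2 distance between a random
    query and n data points, all with iid N(0, 1) coordinates."""
    X = rng.normal(size=(n, d))
    q = rng.normal(size=d)
    dist = np.linalg.norm(X - q, axis=1)
    return float(dist.max() / dist.min())

for d in (2, 16, 128, 1024):
    print(d, round(distance_ratio(d), 2))
```

The ratio approaches 1 as d grows, which is exactly the instability Theorem 2.1 predicts: the nearest and farthest points become nearly indistinguishable.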
# 2.3.1 Maximum Inner Product Search
In the discussion ...
lim_{d→∞} P[⟨q, X⟩ > ϵ] = 0.
Proof. By spherical symmetry, it is easy to see that E[⟨q, X⟩] = 0. The variance of the inner product is then equal to E[⟨q, X⟩²], which can be expanded as follows.

First, find an orthogonal transformation Π : R^d → R^d that maps the query point q to the first standard basis vector (i.e., e1 = [1...
The proof of Theorem 2.2 tells us that the variance of the inner product grows as a function of 1/d and ∥X∥₂². So if our vectors have bounded norms, then we can find a d such that inner products are arbitrarily close to 0. This is yet another reason that approximate MIPS could become meaningless. But if our data point...
1 A distribution is spherically symmetric if it remains invariant under an orthogonal transformation.
Chapter 3 Intrinsic Dimensionality
Abstract We have seen that high dimensionality poses difficulties for vector retrieval. Yet, judging by the progression from hand-crafted feature vectors to sophisticated embeddings...
# 3.1 High-Dimensional Data and Low-Dimensional Manifolds
We talked a lot about the difficulties of answering ϵ-approximate top-k questions in high dimensions. We said, in certain situations, the question itself becomes meaningless and retrieval falls apart. For MIPS, in particular, we argued in Theorem 2.2 that poi...
89 | 23
24
3 Intrinsic Dimensionality
set does not change whether the unused dimensions are taken into account or the vectors corrected to lie in R^{d∘}.
Other times the answer is intuitive but not so obvious. When a text document is represented as a sparse vector, all the document's information is contained entirely in...
In the context of vector retrieval, too, the concept of intrinsic dimensionality often plays an important role. Knowing that data points have a low intrinsic dimensionality means we may be able to reduce dimensionality without (substantially) losing the geometric structure of the data, including inter-point distan...
# 3.2 Doubling Measure and Expansion Rate
this ball by a factor 2, and count again. The count of data points in a "growth-restricted" point set should increase smoothly, rather than suddenly, as we make this ball larger.

In other words, data points "come into view," as Karger and Ruhl [2002] put it, at a constan...
One can think of the expansion rate d∘ as a dimension of sorts. In fact, as we will see later, several works [Dasgupta and Sinha, 2015, Karger and Ruhl, 2002, Beygelzimer et al., 2006] use this notion of intrinsic dimensionality to design algorithms for top-k retrieval or utilize it to derive performance guarantees f...
What happens if we add the origin to the set, so that our set becomes {0} ∪ X? If we choose 0 as the center of the ball and set its radius to r, we have a single point in the resulting ball. The moment we double r, the resulting ball will contain the entire set! In other words, the expansion rate of the updated se...
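The thought experiment can be made concrete with a one-dimensional made-up set: a tight cluster far from the origin, plus the origin itself (all values are invented for illustration):

```python
import numpy as np

# Ten points clustered near 99, plus the origin.
cluster = np.array([[99.0 + 0.1 * i] for i in range(10)])
S = np.vstack([[0.0], cluster])

def ball_count(points: np.ndarray, center: np.ndarray, r: float) -> int:
    """Number of points inside the ball B(center, r)."""
    return int((np.linalg.norm(points - center, axis=1) <= r).sum())

r = 50.0
inner = ball_count(S, np.array([0.0]), r)      # just the origin
outer = ball_count(S, np.array([0.0]), 2 * r)  # the entire set
print(inner, outer)  # 1 11
```

A single doubling of the radius inflates the count from one point to the whole set, so the expansion rate of {0} ∪ X is driven by the size of the set rather than its geometry.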
The base 2 in the definition above can be replaced with any other constant k: The doubling dimension of X is d∘ if the intersection of any ball of radius r with the set can be covered by O(k^{d∘}) balls of radius r/k. Furthermore, the definition can be easily extended to any metric space, not just R^d with the Euclide...
By definition of the expansion rate, for every v ∈ S:

|B(u, 4r)| ≤ |B(v, 8r)| ≤ 2^{4d∘} |B(v, r/2)|.

Because the balls B(v, r/2) for all v ∈ S are disjoint, it follows that |S| ≤ 2^{4d∘}; that is, 2^{4d∘} many balls of radius r cover B(u, 2r). That concludes the proof.
The doubling dimension and expansion rate both quantify the...
96 | # 3.3.1 Properties of the Doubling Dimension
It is helpful to go over a few concrete examples of point sets with bounded doubling dimension in order to understand a few properties of this definition of intrinsic dimensionality. We will start with a simple example: a line segment in R^d with the Euclidean norm.
If the se...
97 | The lemma above tells us that the doubling dimension of a set in the Euclidean space is at most some constant factor larger than the natural dimension of the space; note that this was not the case for the expansion rate. Another important property that speaks to the stability of the doubling dimension is the following.
98 | One consequence of the previous two lemmas is the following statement concerning sparse vectors:
3 Intrinsic Dimensionality
Lemma 3.5 Suppose that X ⊂ R^d is a collection of sparse vectors, each having at most k non-zero coordinates. Then the doubling dimension of X is at most Ck + k log d for some constant C. Proof...
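The k log d term in Lemma 3.5 comes from counting the possible supports (sets of non-zero coordinates) a k-sparse vector in R^d can have: on the order of d^k of them, whose logarithm is k log d. A quick numeric check, with d and k chosen arbitrarily for illustration:

```python
import math

# With at most k of d coordinates allowed to be non-zero, the number of
# possible supports is sum_{j <= k} C(d, j). Its base-2 logarithm is what a
# union of that many low-dimensional pieces adds to the doubling dimension,
# and it is bounded by k * log2(d) for these values.
d, k = 10_000, 8
supports = sum(math.comb(d, j) for j in range(k + 1))
log_supports = math.log2(supports)

print(log_supports)      # about 91 for these values
print(k * math.log2(d))  # the k*log2(d) term: about 106
```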
99 | S. Dasgupta and K. Sinha. Randomized partition trees for nearest neighbor search. Algorithmica, 72(1):237–263, 2015.
A. Gupta, R. Krauthgamer, and J. Lee. Bounded geometries, fractals, and low-distortion embeddings. In 44th Annual IEEE Symposium on Foundations of Computer Science, pages 534–543, 2003.
D. R. Kar...
100 | # 4.1 Intuition
Suppose there was some way to split a collection X into two sub-collections, Xl and Xr, such that X = Xl ∪ Xr and that the two sub-collections have roughly the same size. In general, we can relax the splitting criterion so the two sub-collections are not necessarily partitions; that is, we may have Xl ∩ Xr ≠ ∅.
102 | Fig. 4.1: Illustration of a general branch-and-bound method on a toy collection in R^2. In (a), Rl and Rr are separated by the dashed line h. The distance between query q and the closest vector in Rl is less than the distance between q and h. As such, we do not need to search for the top-1 vector over the points in ...
103 | If δ(q, u*_l) < δ(q, Rr),1 then we have found the optimal point and do not need to search the data points in Xr at all! That is because the δ-ball2 centered at q with radius δ(q, u*_l) is contained entirely in Rl, so that no point from Rr can have a shorter distance to q than u*_l. Refer again to Figure 4.1(a) for an illustration.
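The pruning certificate can be spelled out in a few lines of code. The sketch below is illustrative, not the book's implementation: it splits R² by the vertical hyperplane x = θ and skips the right region whenever the best candidate found on the left is provably closer than any point beyond the hyperplane could be:

```python
import math

def top1_with_pruning(q, theta, X_l, X_r):
    """Exact top-1 search over X_l and X_r with branch-and-bound pruning.

    The space is split by the hyperplane x = theta: R_l is x < theta and
    R_r is x >= theta. Returns the winner and whether X_r was searched."""
    best = min(X_l, key=lambda u: math.dist(q, u))
    # delta(q, R_r): the distance from q to the separating hyperplane,
    # or 0 if q already lies inside R_r.
    dist_to_right_region = max(0.0, theta - q[0])
    if math.dist(q, best) < dist_to_right_region:
        return best, False  # certified optimal; X_r pruned away entirely
    best_r = min(X_r, key=lambda u: math.dist(q, u))
    if math.dist(q, best_r) < math.dist(q, best):
        best = best_r
    return best, True

X_l = [(1.0, 1.0), (2.0, 3.0)]
X_r = [(6.0, 1.0), (7.0, 4.0)]

# Query deep inside R_l: its nearest left candidate is closer than the
# hyperplane x = 5, so the right side is never touched.
print(top1_with_pruning((1.5, 1.2), 5.0, X_l, X_r))  # ((1.0, 1.0), False)

# Query hugging the hyperplane: no certificate, so both sides are searched.
print(top1_with_pruning((4.9, 1.0), 5.0, X_l, X_r))  # ((6.0, 1.0), True)
```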
104 | 4.2 k-dimensional Trees
branch or search, it certifies that u*_l is indeed optimal, thereby solving the top-1 problem exactly.
We can extend the framework above easily by recursively splitting the two sub-collections and characterizing the regions containing the resulting partitions. This leads to a (balanced) binary tree...
105 | The above is the logic that is at the core of branch-and-bound algorithms for top-k retrieval [Dasgupta and Sinha, 2015, Bentley, 1975, Ram and Sinha, 2019, Ciaccia et al., 1997, Yianilos, 1993, Liu et al., 2004, Panigrahy, 2008, Ram and Gray, 2012, Bachrach et al., 2014]. The specific instances of this framework differ...
106 | Let us consider its simplest construction for X ⊂ R^d. The root of the tree is a node that represents the entire space, which naturally contains the entire data collection. Assuming that the size of the collection is greater than 1, we follow a simple procedure to split the node: We select one coordinate axis and partition the collection at the median of the values along that axis...
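Assuming nodes are split at the median of one coordinate axis, cycling through axes by depth (a common convention; the book's construction may differ in details), the build procedure and the accompanying branch-and-bound search can be sketched as follows:

```python
import math

def build_kdtree(points, depth=0):
    """Split at the median along one axis, cycling through axes by depth."""
    if len(points) <= 1:
        return {"leaf": points}
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "axis": axis,
        "threshold": points[mid][axis],  # median value on the chosen axis
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid:], depth + 1),
    }

def search(node, q, best=None):
    """Exact top-1 search with the branch-and-bound pruning rule."""
    if "leaf" in node:
        for p in node["leaf"]:
            if best is None or math.dist(q, p) < math.dist(q, best):
                best = p
        return best
    diff = q[node["axis"]] - node["threshold"]
    near, far = ((node["left"], node["right"]) if diff < 0
                 else (node["right"], node["left"]))
    best = search(near, q, best)
    # Descend into the far child only if the ball around q with radius
    # delta(q, best) still crosses the splitting hyperplane.
    if best is None or abs(diff) < math.dist(q, best):
        best = search(far, q, best)
    return best

pts = [(2.0, 3.0), (5.0, 4.0), (9.0, 6.0), (4.0, 7.0), (8.0, 1.0), (7.0, 2.0)]
tree = build_kdtree(pts)
print(search(tree, (9.0, 2.0)))  # (8.0, 1.0), the true nearest neighbor
```

Because the far child is visited whenever the query ball crosses the split, the search remains exact; the pruning only skips regions that provably cannot improve on the current candidate.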
107 | # 4.2.1 Complexity Analysis
The k-d Tree data structure is fairly simple to construct. It is also efficient: Its space complexity given a set of m vectors is Θ(m) and its construction time has complexity Θ(m log m).3
The search algorithm, however, is not so easy to analyze in general. Friedman et al. [1977] claimed...
108 | Let δ* = min_{u ∈ X} ‖q − u‖₂ be the optimal distance to a query q. Consider the ball of radius δ* centered at q and denote it by B(q, δ*). It is easy to see that the number of leaves we may need to visit in order to certify an initial candidate is upper-bounded by the number of leaf regions (i.e., d-dimensional rectangles) that intersect B(q, δ*)...
109 | 3 The time to construct the tree depends on the complexity of the subroutine that finds the median of a set of values.
where G(d) is the ratio between the volume of the hypercube that contains B(q, δ*) and the volume of B(q, δ*) itself. Because G(d) is independent of m, and because visiting...
110 | G(d) = 2^d (d/2)! / π^{d/2}. (4.2)
Plugging this back into Equation (4.1), we arrive at:
(1 + G(d)^{1/d})^d = (1 + (2/√π) ((d/2)!)^{1/d})^d = (1 + O(√d))^d = O(√d)^d,
where we used Stirling's formula, which approximates n! as √(2πn)(n/e)^n; in particular,
(d/2)! ≈ √(πd) (d/2e)^{d/2}, so that ((d/2)!)^{1/d} = O(√d).
The above shows that the number of leaves to visit...
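Equation (4.2) can be evaluated directly to see how quickly the hypercube-to-ball volume ratio blows up with d; in the sketch below, (d/2)! is written as Γ(d/2 + 1) so that odd dimensions work as well:

```python
import math

def G(d):
    """Ratio of the volume of the hypercube with side 2r to the volume of the
    inscribed ball of radius r in d dimensions; (d/2)! = Gamma(d/2 + 1)."""
    return 2 ** d * math.gamma(d / 2 + 1) / math.pi ** (d / 2)

for d in (2, 4, 8, 16, 32):
    print(d, round(G(d), 1))
# The ratio explodes with d: already by d = 16 the enclosing hypercube is
# hundreds of thousands of times larger than the ball it contains.
```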