Abstract:
A method for providing search results to a user is disclosed. The method includes receiving a first set of information associated with a plurality of web pages and a second set of information associated with a user preference, and determining a commercial score for each web page. A subset of the first set of information is determined based on the second set of information. A visual indicator for the subset of the first set of information is generated in accordance with the commercial score, and the subset and the visual indicator are displayed on a display.
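The flow described above can be sketched in a few lines. This is only an illustration of the claimed steps, not the patented implementation: the field names, topic matching, score range, and "$" indicator scheme are all assumptions.

```python
# Sketch of the claimed flow: filter web-page results by a user preference,
# then attach a visual indicator derived from each page's commercial score.
# All field names, thresholds, and indicator strings are illustrative.

def select_and_annotate(pages, preferred_topic):
    """Return the subset of pages matching the preference, each paired
    with an indicator string chosen from its commercial score."""
    subset = [p for p in pages if preferred_topic in p["topics"]]
    annotated = []
    for page in subset:
        score = page["commercial_score"]   # assumed to lie in [0, 1]
        if score >= 0.66:
            indicator = "$$$"
        elif score >= 0.33:
            indicator = "$$"
        else:
            indicator = "$"
        annotated.append((page["url"], indicator))
    return annotated

pages = [
    {"url": "a.example", "topics": {"shoes"}, "commercial_score": 0.9},
    {"url": "b.example", "topics": {"news"},  "commercial_score": 0.1},
    {"url": "c.example", "topics": {"shoes"}, "commercial_score": 0.4},
]
print(select_and_annotate(pages, "shoes"))
# [('a.example', '$$$'), ('c.example', '$$')]
```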
Abstract:
Systems and methods for behavioral modeling to optimize shopping cart conversion are discussed. For example, a method can include identifying a user interacting with a networked system, accessing user profile data associated with the user, tracking user activity associated with the user, accessing a behavioral model, applying the behavioral model, and determining a shopping cart optimization. The behavioral model can be generated from historical data detailing interactions with the networked system. The behavioral model can be applied to the user profile data and the user activity data to assist in selecting a shopping cart optimization.
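A toy version of that pipeline might look as follows. The hand-set linear scorer stands in for a model actually fit on historical interaction data, and the feature names, weights, and candidate optimizations are all hypothetical.

```python
# Illustrative sketch: apply a behavioral model (a hand-set linear scorer
# standing in for one trained on historical interaction data) to a user's
# profile and activity data, then select a shopping-cart optimization.
# Feature names, weights, and optimization labels are hypothetical.

WEIGHTS = {"cart_items": 0.4, "minutes_idle": -0.2, "past_purchases": 0.3}

def conversion_score(features):
    """Predicted likelihood-of-conversion score (unnormalized)."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def choose_optimization(profile, activity):
    features = {**profile, **activity}
    score = conversion_score(features)
    # Lower predicted conversion -> stronger intervention.
    if score < 0.5:
        return "offer_discount"
    elif score < 1.5:
        return "show_free_shipping_banner"
    return "no_change"

profile = {"past_purchases": 3}                   # stored user profile data
activity = {"cart_items": 2, "minutes_idle": 4}   # tracked this session
print(choose_optimization(profile, activity))
```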
Abstract:
Hierarchical branching deep convolutional neural networks (HD-CNNs) improve existing convolutional neural network (CNN) technology. In an HD-CNN, classes that can be easily distinguished are classified by a higher-layer coarse-category CNN, while the most difficult classifications are handled by lower-layer fine-category CNNs. Multinomial logistic loss and a novel temporal sparsity penalty may be used in HD-CNN training. The use of multinomial logistic loss and a temporal sparsity penalty causes each branching component to deal with distinct subsets of categories.
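The coarse-to-fine routing at the heart of HD-CNN inference can be sketched schematically. Real HD-CNNs use trained CNNs for every component and combine their outputs probabilistically; here plain functions stand in for the networks, and all category labels and features are illustrative.

```python
# Schematic of HD-CNN inference: a coarse classifier routes an input to
# one of several fine classifiers, each responsible for a distinct subset
# of categories. Stub functions stand in for the coarse and fine CNNs.

def coarse_classifier(x):
    # Higher-layer, easy distinction: animal-like vs. vehicle-like input.
    return "animal" if x["furry"] else "vehicle"

FINE_CLASSIFIERS = {
    # Lower-layer, harder distinctions within each coarse category.
    "animal":  lambda x: "cat" if x["size"] < 5 else "dog",
    "vehicle": lambda x: "car" if x["size"] < 5 else "truck",
}

def hd_cnn_predict(x):
    coarse = coarse_classifier(x)        # coarse-category CNN stand-in
    return FINE_CLASSIFIERS[coarse](x)   # fine-category CNN stand-in

print(hd_cnn_predict({"furry": True, "size": 3}))   # cat
print(hd_cnn_predict({"furry": False, "size": 9}))  # truck
```

The temporal sparsity penalty mentioned above is what encourages each fine branch to specialize on a distinct category subset during training; it is not modeled in this inference-only sketch.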
Abstract:
Support vector machines (SVMs), though accurate, are not preferred in applications requiring high classification speed, because the number of support vectors tends to be large. To overcome this problem, a primal system and method with the following properties has been devised: (1) it decouples the idea of basis functions from the concept of support vectors; (2) it greedily finds a set of kernel basis functions of a specified maximum size (dmax) to approximate the SVM primal cost function well; (3) it is efficient and roughly scales as O(n·dmax²), where n is the number of training examples; and (4) the number of basis functions it requires to achieve an accuracy close to the SVM accuracy is usually far less than the number of SVM support vectors.
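The greedy selection in property (2) can be sketched as forward selection of kernel basis centers. This simplified stand-in fits a regularized least-squares objective rather than the full SVM primal cost, re-solves from scratch at every step (the patented method reaches the O(n·dmax²) scaling via incremental updates), and the RBF kernel and regularization constant are illustrative choices.

```python
import numpy as np

# Sketch of the greedy idea: grow a set of kernel basis functions up to a
# maximum size dmax, at each step adding the training point whose basis
# column most reduces a regularized least-squares fit to the labels.
# Simplified stand-in for the primal SVM objective; kernel and lam are
# illustrative, and the naive re-solve here is slower than O(n*dmax^2).

def rbf_kernel(X, centers, gamma=1.0):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_basis_selection(X, y, dmax, lam=1e-3):
    n = len(X)
    chosen = []
    for _ in range(dmax):
        best_i, best_err = None, np.inf
        for i in range(n):
            if i in chosen:
                continue
            idx = chosen + [i]
            K = rbf_kernel(X, X[idx])   # n x |idx| basis matrix
            beta = np.linalg.solve(K.T @ K + lam * np.eye(len(idx)), K.T @ y)
            err = np.sum((K @ beta - y) ** 2)
            if err < best_err:
                best_i, best_err = i, err
        chosen.append(best_i)
    return chosen

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0])
basis = greedy_basis_selection(X, y, dmax=5)
print(len(basis), len(set(basis)))  # 5 5
```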
Abstract:
The present invention provides a system and method for building fast and efficient support vector classifiers for large data classification problems, which is useful for classifying pages from the World Wide Web and other problems with sparse matrices and large numbers of documents. The method takes advantage of the least-squares nature of such problems, employs exact line search in its iterative process, and makes use of a conjugate gradient method appropriate to the problem. In one embodiment, a support vector classifier useful for classifying a plurality of documents, including textual documents, is built by: selecting a plurality of training documents, each training document having suitable numeric attributes which are associated with a training document vector; initializing a classifier weight vector and a classifier intercept for a classifier boundary, the classifier boundary separating at least two document classes; determining which training document vectors are suitable support vectors; and re-computing the classifier weight vector and the classifier intercept for the classifier boundary using the suitable support vectors together with an iteratively reindexed least squares method and a conjugate gradient method with a stopping criterion.
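The embodiment above can be sketched as an iteratively reindexed least-squares loop for an L2-loss linear SVM: each pass re-identifies the support vectors (margin violators) and re-solves a regularized least-squares problem restricted to them. This is a simplified illustration, not the patented implementation: a direct dense solve stands in for the conjugate gradient solver, the intercept is folded into the weight vector and regularized for simplicity, and lam and the iteration cap are arbitrary.

```python
import numpy as np

# Sketch: L2-loss (squared-hinge) linear SVM trained by iteratively
# reindexed least squares. Each pass re-determines the support vectors
# (examples with margin < 1) and re-solves the regularized least-squares
# problem on them. np.linalg.solve stands in for the conjugate gradient
# solver described in the patent; the intercept is regularized here
# only for simplicity.

def irls_svm(X, y, lam=1.0, iters=20):
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # fold intercept into weights
    w = np.zeros(d + 1)
    for _ in range(iters):
        margins = y * (Xb @ w)
        sv = margins < 1                   # current support vectors
        if not sv.any():                   # no violators: done
            break
        A = lam * np.eye(d + 1) + Xb[sv].T @ Xb[sv]
        b = Xb[sv].T @ y[sv]
        w = np.linalg.solve(A, b)
    return w[:-1], w[-1]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 1, (30, 2)), rng.normal(-2, 1, (30, 2))])
y = np.hstack([np.ones(30), -np.ones(30)])
w, b = irls_svm(X, y)
print((np.sign(X @ w + b) == y).mean())  # training accuracy
```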