Talk:Basic Linear Algebra Subprograms


[Untitled]

Should the reference to uBLAS as a BLAS be given a caveat, seeing as it isn't a BLAS implementation but a C++ container class template system?



How about a note for IBM's BLAS tuned for the Cell Processor? Is this important? http://www-03.ibm.com/technology/cell/swlib.html


I created a section with a new heading: Other libraries with Blas functionality. I hope I sorted all the libraries correctly into the two categories.... 16 Feb. 2010. KJ —Preceding unsigned comment added by Kristjan.Jonasson (talk · contribs) 01:05, 16 February 2010 (UTC)[reply]

scalar dot products?

I thought it is either called a scalar product or a dot product; scalar dot product sounds ridiculous. —Preceding unsigned comment added by 79.235.154.60 (talk) 07:41, 15 July 2010 (UTC)[reply]

MATTLAS BLAS

I've twice reverted the introduction of a link to MATTLAS BLAS.[1][2] (I was wrong about the age of the project.) The text is:

MATTLAS BLAS
A modern task based LGPL BLAS implementation written in C++, currently supporting the AVX instruction set for x86_64. [3]

Following the link to github.com, the project README states:

MATTLAS (Matt's Linear Algebra Subroutines) is a high performance BLAS implementation. I am using this BLAS (Basic Linear Algebra Subroutines) implementation primarily for research purposes, however, I intend to produce something of high quality that should be competitive with enterprise vendor solutions. MATTLAS is licensed under the GNU Lesser General Public License, version 3. This is a very early alpha, so please submit bugs when you find them. Also, the library currently only supports AVX and x64 on linux.

By its own statements, this implementation is a personal project in "very early alpha". I don't see anything that suggests stature or reliable sources. Consequently, I believe the mention is WP:UNDUE.

The other BLAS implementations are well-known efforts (e.g., MKL and Goto) or at least university efforts. For example, there was a recent addition of BLIS; that project has technical reports (the second TR has many coauthors) and claims funding by Microsoft and NSF. Although the technical reports are not secondary sources, both TRs claim they have been submitted to journals or conferences.

The MATTLAS BLAS doesn't yet belong in an encyclopedia. I'd revert it again, but I'm at my basic revlimit. Glrx (talk) 23:54, 4 October 2013 (UTC)[reply]

Actually, there's now a very reliable source that MATTLAS BLAS is an enterprise-quality implementation. Sarcasm mode off: I've reverted it. This is clearly a one man project with no mention beyond the guy's various webpages. It has no forks on GitHub, no issues or discussion there or on Launchpad, and it isn't even mentioned in any of his publications. QVVERTYVS (hm?) 22:24, 21 October 2013 (UTC)[reply]

Proposed merge with AXPY

Articles about single BLAS functions will likely never rise above the level of WP:HOWTO; this one certainly hasn't. QVVERTYVS (hm?) 13:39, 20 October 2013 (UTC)[reply]

  • Support. Covering many subprograms such as AXPY also runs into the problem of creating a reference manual. It is reasonable to have some examples show how different types are handled. Glrx (talk) 22:07, 21 October 2013 (UTC)[reply]
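
A note on the "different types" point above: it presumably refers to the BLAS naming convention, where the same AXPY operation, y := αx + y, exists in four type variants selected by a one-letter prefix. A minimal illustration using the CBLAS prototypes (the corresponding Fortran names are SAXPY, DAXPY, CAXPY and ZAXPY; const qualifiers simplified):

/* y := alpha*x + y, one routine per element type */
void cblas_saxpy(int n, float alpha, const float *x, int incx, float *y, int incy);     /* single precision real    */
void cblas_daxpy(int n, double alpha, const double *x, int incx, double *y, int incy);  /* double precision real    */
void cblas_caxpy(int n, const void *alpha, const void *x, int incx, void *y, int incy); /* single precision complex */
void cblas_zaxpy(int n, const void *alpha, const void *x, int incx, void *y, int incy); /* double precision complex */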

Proposed merge with General Matrix Multiply

Same reasoning as with the former article AXPY, except this one actually has some interesting content that does not violate WP:HOWTO. This content should probably be split between the BLAS article and matrix multiplication (unless there's a specific page on matrix product algorithms?). QVVERTYVS (hm?) 21:37, 22 October 2013 (UTC)[reply]

Not an API?

Glrx removed my remark that BLAS is a de facto standard API for linear algebra. However, I do have the idea that it is: there are many different implementations of BLAS, some based on the reference implementation, some not, all sharing (almost) the same calling sequence and output semantics. Doesn't that constitute an API? (Or maybe two APIs, Fortran BLAS and CBLAS?) QVVERTYVS (hm?) 12:55, 11 December 2013 (UTC)[reply]
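
For concreteness, a minimal sketch in C of what "(almost) the same calling sequence" looks like for DAXPY (y := αx + y), assuming a Fortran BLAS exposed through the common trailing-underscore, pass-by-reference binding alongside the CBLAS wrapper; neither convention is universal:

#include <cblas.h>

/* Fortran BLAS binding: every argument is passed by reference;
   the trailing underscore is a common, but not universal, convention. */
extern void daxpy_(const int *n, const double *alpha,
                   const double *x, const int *incx,
                   double *y, const int *incy);

int main(void)
{
    double x[3] = {1.0, 2.0, 3.0}, y[3] = {4.0, 5.0, 6.0};
    double alpha = 2.0;
    int n = 3, inc = 1;

    daxpy_(&n, &alpha, x, &inc, y, &inc);    /* Fortran interface: y := alpha*x + y */
    cblas_daxpy(n, alpha, x, inc, y, inc);   /* CBLAS: same argument order, scalars by value */
    return 0;
}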

My edit corrected the use of "function", left the "de facto standard" claim, but removed the term API. BLAS is an interface, and it is a programming interface, but it is not (possibly save the later sparse matrix construction/access routines) intended as an applications programming interface. The 1997 BLAS Quick Reference Guide does not mention API. The 2000 BLAST Forum Standard refers to BLAS as "a specification of a set of kernel routines for linear algebra"; it does not use API even though API was a common term in 2000.
The focus of BLAS is not the applications programmer but rather those programmers who implement numerical libraries for others. LINPACK and LAPACK are APIs; the intention is that applications programmers will directly call the LINPACK and LAPACK interfaces. The text I replaced acknowledges that viewpoint: it claimed that BLAS was "a standard API for linear algebra routines." Routines are not applications, so the statement essentially says BLAS is an API for APIs, a statement that lacks precision.
Google does turn up some hits for BLAS API, but WP is the first hit, and the IBM Software Development Kit for Multicore Acceleration v3.0 Basic Linear Algebra Subprograms Programmer’s Guide and API Reference does not seem to carry much weight (I won't WP:COPYLINK it here). LLL's http://acs.lbl.gov/software/colt/api/cern/colt/matrix/linalg/Blas.html is a hit because BLAS was dumped into an API directory tree for linear algebra; the actual page calls BLAS a set of "High quality 'building block' routines for performing basic vector and matrix operations". Glrx (talk) 17:54, 11 December 2013 (UTC)[reply]
This raises the question of what an "application" is. That term can be read in two ways: either it means a user-facing program, or it's taken relative to BLAS, e.g. Julia/Matlab/NumPy apply BLAS and are therefore applications of it. But if few sources call BLAS an API, and the standard calls it a specification, then I'll use the latter term instead. QVVERTYVS (hm?) 12:13, 16 January 2014 (UTC)[reply]
Application software, http://www.webopedia.com/TERM/A/application.html, ... Glrx (talk) 01:02, 19 January 2014 (UTC)[reply]
If you look at Wikipedia's definition of Application programming interface, then you'll find that the narrow notion of application software hardly features in it. That makes sense, because some APIs (notably OS APIs such as POSIX) are meant for systems software as well as end-user applications. QVVERTYVS (hm?) 11:32, 19 January 2014 (UTC)[reply]
I consider that article poor. Glrx (talk) 22:05, 22 January 2014 (UTC)[reply]
Your definition of API seems different from the one I'm familiar with. The ATLAS FAQ also speaks of a "BLAS API". QVVERTYVS (hm?) 12:34, 25 March 2014 (UTC)[reply]

Reason for beta parameter

From the section Level 3:

by decomposing one or both of A, B into block matrices, GEMM can be implemented recursively. This is one of the motivations for including the β parameter,[dubious ] so the results of previous blocks can be accumulated.

I find this (unsourced) remark dubious. Had there been no β, the recursive algorithm could still be implemented by putting a driver routine around it. Simplified C code:

/* "Matrix" stands in for whatever matrix representation the library uses;
   recursive_gemm is the internal block-recursive kernel, which does take a beta argument. */
void gemm(int M, int N, int K, double alpha,
          Matrix A, Matrix B, Matrix C)
{
    /* Public interface without beta: the driver fixes beta = 1, so the
       recursion underneath can still accumulate partial block products. */
    recursive_gemm(M, N, K, alpha, A, B, C, 1.0 /* beta! */);
}

QVVERTYVS (hm?) 10:48, 29 January 2015 (UTC)[reply]

You think it would be smarter to implement gemm with the β parameter, but then only expose an interface forcing β = 1? I'm not sure why anyone would make that design decision in a low level library like BLAS. This seems like more evidence for the unsourced remark but a source would be nice. 50.191.22.227 (talk) 18:08, 28 April 2015 (UTC)[reply]
That's not what I'm saying at all. I said that if there had been no such parameter, the divide-and-conquer implementation would have been just as feasible so the reasoning is flawed. QVVERTYVS (hm?) 19:00, 28 April 2015 (UTC)[reply]
I think the section is trying to say something different.
For example, to multiply one large matrix A by another B, divide and conquer. An intermediate problem might be multiplying the first 16 rows of A by the first 16 columns of B to get a 16 by 16 subresult in C. That multiplication might be done by dividing Asub and Bsub into 16 by 16 chunks. I set β to 0 for the first sub-sub-product and then set β to 1 for the subsequent sub-sub-product accumulations. If I don't have β, then I need one routine for the initial case and a different routine for the accumulation case.
Glrx (talk) 17:04, 20 May 2015 (UTC)[reply]
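
A minimal sketch of the accumulation pattern described above, in C with CBLAS (assuming row-major storage and a block size NB that divides K evenly; a real GEMM would be organized quite differently):

#include <cblas.h>

/* C := alpha*A*B, computed as a sum of panel products over the K dimension.
   beta = 0 initializes C on the first panel; beta = 1 accumulates afterwards. */
void blocked_gemm(int M, int N, int K, int NB, double alpha,
                  const double *A, const double *B, double *C)
{
    for (int k0 = 0; k0 < K; k0 += NB) {
        double beta = (k0 == 0) ? 0.0 : 1.0;
        cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                    M, N, NB, alpha,
                    A + k0, K,               /* M x NB panel of A, lda = K  */
                    B + (size_t)k0 * N, N,   /* NB x N panel of B, ldb = N  */
                    beta, C, N);             /* C is M x N, ldc = N         */
    }
}
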
But then it's not GEMM that is implemented recursively, but GEMM being used as a base case for recursive algorithms; this is what Dongarra seems to be suggesting too. So the text is off. QVVERTYVS (hm?) 08:52, 21 May 2015 (UTC)[reply]
No, I gave an example of how GEMM could call itself on simpler problems (and why β is needed). I don't know if GEMM is done that way (and given most Fortrans, it probably isn't; D comments about Fortran excluding recursion). The ref you gave shows that GEMM does have matrix update applications in its own right. Ref 19 could be an appropriate source for recursive calls. Instead of explicit recursion, I would expect GEMM to break the problem into reasonably-sized subblocks and then iterate them with a base-case GEMM. Glrx (talk) 19:21, 21 May 2015 (UTC)[reply]

External links modified

Hello fellow Wikipedians,

I have just modified 7 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 01:08, 28 October 2016 (UTC)[reply]

External links modified

Hello fellow Wikipedians,

I have just modified 5 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 13:49, 15 July 2017 (UTC)[reply]

External links modified

Hello fellow Wikipedians,

I have just modified 2 external links on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 09:13, 9 September 2017 (UTC)[reply]

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Basic Linear Algebra Subprograms. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 11:28, 14 September 2017 (UTC)[reply]

Which applications?

This article would be more useful if it described the most notable applications for BLAS. I know that matrix math is used in a wide range of applications from neural networks to the design of nuclear bombs, but are there particular applications or classes of applications that tend to use BLAS specifically rather than other library interface specifications? 67.188.1.213 (talk) 20:42, 3 September 2021 (UTC)[reply]