Meeting with some customers, manipulating SQL Server with Stored Procs
I met today with some customers from one company. They took time out of their busy schedules to travel to Redmond and spend a couple of days addressing some SQL Server and FoxPro issues. It's always a pleasure to meet with customers and learn about their real-world needs.
Apparently they have a VFP application that their clients love. They switched the back-end data store to SQL Server, and their clients immediately reported a performance decrease.
When they start their Fox app, the first form takes many seconds to appear. We discussed various possible causes for the slowdown and came up with some reasonable ways to hunt for the bottleneck. Upon startup, the VFP form was retrieving 62,000 records from SQL Server using a query with various parameters; when the data was in native VFP tables, the startup time was unnoticeable. If the number of records was small, startup was much faster, which indicated that the bottleneck was not the creation of the VFP form itself. We talked about how changing the back-end data store can introduce or move bottlenecks, and about some possible design changes.
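For instance, a quick way to confirm that the row count itself is the cost is to trim the startup query with SQL Pass Through. A sketch (the orders table, custid column, and row cap are made-up names for illustration):

nh=SQLSTRINGCONNECT("driver=sql server;server=(local)")
lcCust="ACME" && only the customer the form is about to display
IF SQLEXEC(nh,"select top 100 * from orders where custid=?lcCust","c_orders")>0
    * c_orders now holds at most 100 rows instead of all 62,000
    ?RECCOUNT("c_orders")
ENDIF
SQLDISCONNECT(nh)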
We also discussed SQL Server stored procedures vs. remote views vs. SQL Pass Through, and the various VFP remote view and connection properties that could help speed up their form.
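For example, a few of those properties can make the form usable before the entire result set arrives. A sketch, where myview and myconn are hypothetical names in the database container:

DBSETPROP("myview","View","FetchSize",100) && rows fetched per trip to the server
DBSETPROP("myview","View","FetchAsNeeded",.T.) && fetch the rest only as it's needed
DBSETPROP("myview","View","MaxRecords",1000) && cap the result set outright
DBSETPROP("myconn","Connection","ConnectTimeOut",15) && fail fast on a dead server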
I mentioned that you can create SQL Server stored procedures dynamically from VFP code, and that an SP can be made on a temporary basis: created, used for a few seconds, then dropped.
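The sample below creates and drops a regular SP, but SQL Server also supports temporary stored procedures directly: a name beginning with # is private to the connection and vanishes when the connection closes. A sketch, assuming nh is an open connection handle and a cust table like the one in the sample:

SQLEXEC(nh,"create procedure #quickie @n tinyint as select * from cust where amount>@n")
SQLEXEC(nh,"exec #quickie @n=5","c_result") && use it for a few seconds...
SQLDISCONNECT(nh) && ...and it is dropped along with the connection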
Another topic was propagating schema updates. If the customer's client requests a new field, or a change to a field's type, then both the remote view definitions and the SQL Server tables need to be changed.
Over a dozen years ago, I faced similar issues as an independent consultant in Honolulu. The tables needed to change, which meant the associated code needed to be updated too. To solve this, I stored the table and field names and types in a single schema table. When the application starts up, the current environment is compared with the desired schema: if a table doesn't exist, it is created; if the structures don't match, they are altered. This check is fast enough to run every time the application starts, which ensures the code and the data schema never fall out of sync.
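A minimal sketch of that startup check for one native table, assuming a schema cursor named myschema with hypothetical tablename, fieldname, and fieldtype columns (fieldtype holding a full spec such as "C(10)"):

USE cust EXCLUSIVE IN 0 && ALTER TABLE needs exclusive access
SELECT myschema
SCAN FOR ALLTRIM(tablename)=="cust"
    IF TYPE("cust."+ALLTRIM(fieldname))="U" && desired field missing from the live table?
        lcCmd="ALTER TABLE cust ADD COLUMN "+ALLTRIM(fieldname)+" "+ALLTRIM(fieldtype)
        &lcCmd && e.g. ALTER TABLE cust ADD COLUMN newcol C(10)
    ENDIF
ENDSCAN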
With SQL Server tables added to the mix, the same method can be used. To make it faster, the schema table can be uploaded to SQL Server, and an SP can be created to update the environment.
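The SQL Server side of the check can go through INFORMATION_SCHEMA. A sketch that reuses the nh connection handle from the sample below, with a hypothetical newcol column:

IF SQLEXEC(nh,"select COLUMN_NAME from INFORMATION_SCHEMA.COLUMNS "+;
        "where TABLE_NAME='cust' and COLUMN_NAME='newcol'","c_check")>0 ;
        AND RECCOUNT("c_check")=0
    SQLEXEC(nh,"alter table cust add newcol char(10)") && bring the live table up to date
ENDIF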
The example below is modeled on the SQL Server interpreter sample. It adds creating a stored procedure with a parameter, invoking that SP, removing it, and dynamically adding a column.
CLEAR ALL
CLEAR
SQLDISCONNECT(0) && close All connections
*cstr="driver=sql server;server=(local)\sqlexpress"
cstr="driver=sql server;server=(local)"
nh=SQLSTRINGCONNECT(cstr)
?"Handle =",nh
IF nh<0
    ?AERROR(aa)
    LIST MEMORY LIKE aa
    RETURN
ENDIF
IF SqlDoIt(nh,"use test")<0 && If we can't use the test database (alt: use sp_helpdb)
    SqlDoIt(nh,"Create database test")
    SqlDoIt(nh,"use test")
ENDIF
SET TEXTMERGE ON TO memvar myvar NOSHOW && put the following text into a variable called myvar
SQLTABLES(nh) && get a table of tables
SELECT * FROM sqlresult WHERE table_type="TABLE" AND table_name = "cust" INTO CURSOR foo
IF _tally>0
\DROP TABLE cust
ENDIF
\CREATE TABLE cust (name char(10),amount tinyint,tstamp datetime,myxml ntext)
FOR i = 1 TO 10
\INSERT INTO cust (name,amount,tstamp,myxml) VALUES ('test<<TRANSFORM(i)>>',
\\<<TRANSFORM(i)>>,'<<DATETIME()-i*86400>>',
\\'<MyTag MyAttr="val<<i>>">Mydata</MyTag>')
ENDFOR
\alter table cust add newcol char(10)
\update cust set newcol='foo'
\select * from cust
\!list off name,amount,tstamp,myxml
* \drop procedure mySProc
\create procedure mySProc @parm1 char(10) as select * from cust where name='test8' or name=@parm1
\!?"Exec my sproc"
\mySProc @parm1='test2'
\!list off name,newcol,amount,tstamp,myxml
\drop procedure mySProc
*!* \sp_tables
* \!brow last
*use master database, so test is not used anymore
\use master
\drop database test
\sp_helpdb
\!list PADR(name,20),db_size
SET TEXTMERGE TO && close the text merge; myvar now holds the script
n=ALINES(aa,myvar) && put the lines into an array
FOR i = 1 TO n && for each line
    ?"Exec line "+TRANSFORM(i),aa[i]
    IF SqlDoIt(nh,aa[i])<0 && Execute it
        ?AERROR(aErrs)
        LIST MEMORY LIKE aErrs
        EXIT
    ENDIF
ENDFOR
SQLDISCONNECT(0) && close All connections
RETURN
PROCEDURE SqlDoIt(nH as Integer, cCmd as String)
    nRetval=0
    IF LEFT(cCmd,1)='!' && use "!" for Fox commands
        cCmd=SUBSTR(cCmd,2)
        &cCmd && execute the command
    ELSE
        nRetval= SQLEXEC(nH,cCmd)
    ENDIF
    RETURN nRetval
Comments
- Anonymous
May 22, 2006
In reality, performance with remote SQL Server tables is not the same as with native tables, even though SQL Server 2005 ships the "SQL Native Client", which is faster than the previous ODBC driver. Nevertheless, I believe the CursorAdapter should be the faster way to put the SQL result data into native cursors. The truth is that CursorAdapter schemas are a pain when a table's structure changes, for example from c(20) to c(50). I believe an option or tool should be added that updates all the CursorAdapter schemas in the application; this would reduce maintenance time.