This wasn't something a client asked for; I just wanted to see for myself how far I could push read speed on a large data set. I tried it yesterday on a table with 1,083,166 rows. First I put the whole query in a stored procedure and read everything in one go: it took over ten minutes, the machine was effectively frozen the whole time, and I had to restart my laptop when it finished. Then I switched approaches and read the data in pages; timed with a stopwatch it took about three minutes, and I could still do other work while it ran. Resource usage was still heavy, but at least the machine stayed usable. Here is the code.
<%@ Import Namespace="System.Data" %>
<%@ Import Namespace="System.Data.SqlClient" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN" "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">
<script runat="server" language="vb">
Dim pageCount As Int32 ' total number of rows in the table
Dim pageSize As Int16  ' total number of 1000-row pages
Dim conStore As SqlConnection
Dim cmdStore As SqlCommand
Dim dtr As SqlDataReader

Sub Page_Load(ByVal Sender As Object, ByVal e As EventArgs)
    Dim func1 As Int32 ' lower Auto_id bound for the stored-procedure call
    Dim func2 As Int32 ' upper Auto_id bound for the stored-procedure call
    Dim i As Int32     ' loop counter
    If Not IsPostBack Then
        GetPageCount()
    End If
    ' Integer division ("\", not "/"), plus one page for a partial last page.
    pageSize = pageCount \ 1000
    If pageCount Mod 1000 > 0 Then
        pageSize += 1
    End If
End Sub

' Renamed from Page() so it doesn't collide with the built-in Page class.
Function GetPageCount() As Int32
    conStore = New SqlConnection("server=localhost;UID=sa;PWD=19821113;database=lyjdb")
    cmdStore = New SqlCommand("Select Count(Auto_id) As Page From Capaccity", conStore)
    conStore.Open()
    dtr = cmdStore.ExecuteReader()
    If dtr.Read() Then
        pageCount = dtr("Page")
    End If
    dtr.Close()
    conStore.Close()
    Return pageCount
End Function
</script>
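The loop that actually pulls the rows page by page is not shown above. Here is a minimal sketch of how it might look, calling the Tablelist stored procedure listed below in 1000-row steps. It assumes Auto_id values are contiguous starting at 1 (the BETWEEN-based procedure relies on that anyway); the sub name ReadAllPages and the row-processing placeholder are mine.

```vb
' Sketch only: read the Capaccity table one 1000-row page at a time
' via the Tablelist stored procedure. Error handling is omitted.
Sub ReadAllPages()
    Dim i As Int32
    conStore = New SqlConnection("server=localhost;UID=sa;PWD=19821113;database=lyjdb")
    cmdStore = New SqlCommand("Tablelist", conStore)
    cmdStore.CommandType = CommandType.StoredProcedure
    cmdStore.Parameters.Add("@List1", SqlDbType.Int)
    cmdStore.Parameters.Add("@List2", SqlDbType.Int)
    conStore.Open()
    For i = 0 To pageSize - 1
        cmdStore.Parameters("@List1").Value = i * 1000 + 1   ' first Auto_id in this page
        cmdStore.Parameters("@List2").Value = (i + 1) * 1000 ' last Auto_id in this page
        dtr = cmdStore.ExecuteReader()
        While dtr.Read()
            ' process one row here
        End While
        dtr.Close() ' must close the reader before reusing the command
    Next
    conStore.Close()
End Sub
```

Because each page is a separate, short query, the server and the page stay responsive between batches, which matches the behavior described above.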
The stored procedure ([size] and [out] are bracketed because OUT is a reserved word in T-SQL):
CREATE PROCEDURE Tablelist @List1 int, @List2 int
AS
SELECT auto_id, number, pro_num, [size], company, in_price, out_price,
       entry, [out], backoff, backagren, capaccity, capaccity_price
FROM Capaccity C
WHERE C.Auto_id BETWEEN @List1 AND @List2
GO
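One caveat with BETWEEN on fixed bounds: if rows are ever deleted, Auto_id develops gaps and some pages come back short or empty while the loop still runs the full page count. A variant that tolerates gaps is keyset paging: fetch the next 1000 rows after the last id actually seen. This is a sketch, not part of the original post; the procedure name TablelistNext and the @LastId parameter are mine.

```sql
-- Alternative paging that tolerates gaps in Auto_id: fetch the next
-- 1000 rows after the last id seen, instead of BETWEEN on fixed bounds.
CREATE PROCEDURE TablelistNext @LastId int
AS
SELECT TOP 1000 auto_id, number, pro_num, [size], company, in_price,
       out_price, entry, [out], backoff, backagren, capaccity, capaccity_price
FROM Capaccity
WHERE auto_id > @LastId
ORDER BY auto_id
GO
```

The caller starts with @LastId = 0 and, after each batch, passes in the largest auto_id it just read; the loop stops when a batch comes back with fewer than 1000 rows.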