thanks app_engine,Quote:
Originally Posted by app_engine
I will try that :)
Nice to see this thread..
Any Mainframe techies here?? Just to know.. so that if we need help in the future we can come here for it :wink:
Vithagan,Quote:
Originally Posted by vithagan
people still use mainframes :shock: :roll:
:yes: Absolutely.. many financial institutions still run on Mainframes. :)Quote:
Originally Posted by PARAMASHIVAN
I'm also trying hard to move to the latest technologies.. but I'm not able to :(
Good thread :clap:
Vithagan,Quote:
Originally Posted by vithagan
Why are you worried? Mainframes will never die :thumbsup:
:cool2:Quote:
Originally Posted by VinodKumar
Paramashivan: There is a company still manufacturing mainframes ! :lol:Quote:
Originally Posted by PARAMASHIVAN
You mean IBM :lol2:Quote:
Originally Posted by rajraj
Hope this is the right thread to post this - can anyone give me info on TECHNICAL WRITER jobs?
This is not a recruitment agency :rotfl: Google it, Shyam :yes:Quote:
Originally Posted by Lambretta
:ty: so much for the help, Raghu! :roll: :sigh2:Quote:
Originally Posted by PARAMASHIVAN
What great help you have given! :bow: :evil:
Lambretta: Contact Badri ( Moderator) who is a technical writer in Australia ! :)Quote:
Originally Posted by Lambretta
Yes he is a technical author of cricinfo.com :lol2:Quote:
Originally Posted by rajraj
Has anyone worked 'Extensively' with SQL Server 2005 / 2008 cursors?
Yes, to some extent. Cursors should be used only if really necessary.Quote:
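For reference, a minimal T-SQL cursor looks like this (a sketch only - the Employees table and its columns are illustrative, not from this thread). Note how much ceremony it takes compared to a single set-based statement, which is why cursors are a last resort:

```sql
-- Hypothetical example: walk through a table row by row with a cursor
DECLARE @id INT, @salary MONEY;

DECLARE emp_cursor CURSOR FOR
    SELECT id, salary FROM Employees;

OPEN emp_cursor;
FETCH NEXT FROM emp_cursor INTO @id, @salary;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- per-row work goes here; a set-based UPDATE is usually faster
    PRINT @id;
    FETCH NEXT FROM emp_cursor INTO @id, @salary;
END

CLOSE emp_cursor;
DEALLOCATE emp_cursor;
```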
Originally Posted by PARAMASHIVAN
Kirukan
Thanks KirukanQuote:
Originally Posted by kirukan
just saw your post now, I will post my query here soon. :)
Hi all
I have a question about 'Duplicates' in SQL Server database tables. Consider Table_A, with columns Col_A, Col_B, Col_C.
Now if I run this query, will it only find duplicate values in the Col_A column, rather than duplicate rows in the table?
Select Col_A from Table_A
group by Col_A
having count(*) > 1
For duplicate rows, you need to group by all columns.Quote:
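For example, using the Table_A columns from the question above (a sketch):

```sql
-- Duplicate values in a single column:
SELECT Col_A
FROM Table_A
GROUP BY Col_A
HAVING COUNT(*) > 1;

-- Duplicate whole rows: group by every column in the table
SELECT Col_A, Col_B, Col_C
FROM Table_A
GROUP BY Col_A, Col_B, Col_C
HAVING COUNT(*) > 1;
```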
Originally Posted by PARAMASHIVAN
Oh Thanks App :)
app_engine
Small problem: I cannot GROUP BY any columns that are of the ntext or image data types. Do I need to convert them all into the 'int' data type?? :roll:
I haven't used GROUP BY on image fields, but it works for character and numeric fields. If this is for a browser app, I suggest you ask in such communities, which may have a quick fix :-)
thanks app_engine na
Just saw your postQuote:
Originally Posted by PARAMASHIVAN
You can try this:
--nText
SELECT
    CONVERT(varchar(8000), nTextCol),
    MAX(Amount)
FROM
    TableA
GROUP BY
    CONVERT(varchar(8000), nTextCol)
--Image
SELECT
    CONVERT(varbinary(8), ImageCol),
    MAX(Amount)
FROM
    TableA
GROUP BY
    CONVERT(varbinary(8), ImageCol)
This may help you
Kirukan
You have not included the 'Having' clause :roll:
I thought the 'having count(*) > 1' clause is the 'only' way to find duplicates? :roll:
My suggestion was for adding image and ntext in group by clause...Quote:
Originally Posted by PARAMASHIVAN
Oh thanks :)
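Putting the two suggestions together, the duplicate check on a converted ntext column might look like this (a sketch; nTextCol and TableA are the illustrative names from the post above):

```sql
SELECT
    CONVERT(varchar(8000), nTextCol),
    COUNT(*)
FROM TableA
GROUP BY CONVERT(varchar(8000), nTextCol)
HAVING COUNT(*) > 1;
```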
Does any one here know much about SSRS in SQL server 2005 ?? esp adding parameters to a report?
thanks
Parama .... You could use Report Builder 2.0, which is supposedly user friendly for designing reports
Is it a free download? :roll: Never used it before. This SSRS is so irritating; stored procedures are 'much better' for querying the database!Quote:
Originally Posted by bingleguy
I guess yes ....
Has any one used 'Core FTP ' software? Is it any good ? :roll:
Someone pls help. See the query below. When I run it, I want all the first name / last name values grouped together. I have included them in the GROUP BY clause, but in the output the first_name, last_name values are not grouped together.
USE hospices
SELECT
dbo._contact.reference,
dbo._contact.first_name,
dbo._contact.last_name,
don2.value_gross AS ValueGross,
don2.value_net AS ValueNet,
'Individual' AS donorType
FROM dbo._Donation don2
INNER Join dbo._Contact ON don2.supporter_id = dbo._Contact.id
WHERE don2.legacy_id is null
and dbo._Contact.dutchess_norfolk_contact=1
and dbo._Contact.deceased=0
and don2.value_gross >=500
group by
dbo._contact.reference,
dbo._contact.first_name,
dbo._contact.last_name,
don2.value_gross,
don2.value_net
ORDER BY ValueGross DESC
Hey, you're grouping by all the fields that you're selecting.
What value do you want to see aggregated - is it count(*) or something else?
As written, your query will only return the hardcoded value "Individual" for each distinct combination of reference, first_name, last_name, value_gross and value_net.
If you want the sum of, say, value_gross or value_net, then you need to group by only reference / first_name / last_name and place the value columns inside the desired aggregate function.
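For example, a sketch of that aggregated version, using the same hospices tables as the query above (one row per contact, with the donation values summed):

```sql
SELECT
    dbo._contact.reference,
    dbo._contact.first_name,
    dbo._contact.last_name,
    SUM(don2.value_gross) AS ValueGross,
    SUM(don2.value_net)   AS ValueNet,
    'Individual'          AS donorType
FROM dbo._Donation don2
INNER JOIN dbo._Contact ON don2.supporter_id = dbo._Contact.id
WHERE don2.legacy_id IS NULL
  AND dbo._Contact.dutchess_norfolk_contact = 1
  AND dbo._Contact.deceased = 0
  AND don2.value_gross >= 500
GROUP BY
    dbo._contact.reference,
    dbo._contact.first_name,
    dbo._contact.last_name
ORDER BY ValueGross DESC;
```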
Thanks na, I ordered by last_name and it works.
:P
many thanks for 'prompt' reply :notworthy:
use hospices_test
select contact_id,event_id
from _event_response
group by contact_id,event_id
having count(*)> 1
The above statement returned all the duplicates, and I have to delete all of them. I tried a subquery, but it does not work, as it cannot return more than one value.
Is there any other way I can do this? Please help, this is quite urgent.
Params,
I've not worked recently on Microsoft SQL Server; you may want to try the solution Microsoft gives here:
http://support.microsoft.com/kb/139444
For Oracle, I normally use "rowid" as in this sql below :
DELETE FROM our_table
WHERE rowid not in
(SELECT MIN(rowid)
FROM our_table
GROUP BY column1, column2, column3...) ;
I'm not sure whether MS-SQL uses rowid, give it a try :-)
Hello annehQuote:
Originally Posted by app_engine
I have seen the MSDN website, wasn't any help :( will try your code and let you know.
many thanks for help :)
Sorry, it does not work - there is no rowid in SQL Server. The closest thing is ROW_NUMBER(), and I tried that but it doesn't work :|
no worries thanks any way
This article also talks about using a temp table (in the case of SQL Server) :
http://database-programming.suite101..._in_sql_server
Maybe there are other ways, but if you have grants to create a new table, that is possibly a quick (and dirty) solution.
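One approach that does work in SQL Server 2005 and later is to delete through a CTE ranked with ROW_NUMBER() - a sketch, assuming the _event_response table from the earlier post, keeping one row per (contact_id, event_id) pair:

```sql
WITH ranked AS (
    SELECT
        ROW_NUMBER() OVER (
            PARTITION BY contact_id, event_id
            ORDER BY contact_id
        ) AS rn
    FROM _event_response
)
DELETE FROM ranked
WHERE rn > 1;  -- removes every copy after the first in each group
```

Deleting through the CTE removes rows from the underlying table, so it avoids the "subquery returns more than one value" problem entirely.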