Fix supporter import bug

Eric Schultz 2020-09-01 18:16:17 -05:00 committed by Eric Schultz
parent 358b0a2f38
commit 176f88a9c4
13 changed files with 483 additions and 366 deletions


@@ -1781,6 +1781,47 @@ OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
------
** bcrypt; version 3.1.13 --
Copyright 2007-2011
Copyright (c) 1998-2010 Solar Designer
Copyright (c) 1998-2014 Solar Designer
Copyright (c) 2000-2002 Solar Designer
Copyright (c) 2000-2011 Solar Designer
Copyright (c) 2000-2014 Solar Designer
Copyright (c) 2006 Damien Miller <djm@mindrot.org>
(The MIT License)
Copyright 2007-2011:
* Coda Hale <coda.hale@gmail.com>
C implementation of the BCrypt algorithm by Solar Designer and placed in the
public domain.
jBCrypt is Copyright (c) 2006 Damien Miller <djm@mindrot.org>.
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
------
** wisper-activejob; version 1.0.0 --
@@ -2540,6 +2581,66 @@ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
------
** nio4r; version 2.5.2 --
(c) 2011 Emanuele Giaquinta
Copyright (c) 2011 Tony Arcieri.
Copyright, 2019, by Tony Arcieri.
Copyright (c) 2011-2017 Tony Arcieri.
Copyright, 2007-2019, by Marc Alexander Lehmann.
(c) 2009-2015 Marc Alexander Lehmann <libecb@schmorp.de>
Copyright (c) 2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007-2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009,2010,2011,2012,2013 Marc Alexander Lehmann.
Copyright (c) 2007,2008,2009,2010,2011 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009,2010,2011,2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright, 2019, by Samuel G. D. Williams (http://www.codeotaku.com/samuel-williams).
Copyright (c) 2007,2008,2009,2010,2011,2016,2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009,2010,2011,2012,2013,2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009,2010,2011,2016,2017,2019 Marc Alexander Lehmann <libev@schmorp.de>
Copyright (c) 2007,2008,2009,2010,2011,2012,2013,2016,2019 Marc Alexander Lehmann <libev@schmorp.de>
All files in libev are
Copyright (c)2007,2008,2009,2010,2011,2012,2013 Marc Alexander Lehmann.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Alternatively, the contents of this package may be used under the terms
of the GNU General Public License ("GPL") version 2 or any later version,
in which case the provisions of the GPL are applicable instead of the
above. If you wish to allow the use of your version of this package only
under the terms of the GPL and not to allow others to use your version of
this file under the BSD license, indicate your decision by deleting the
provisions above and replace them with the notice and other provisions
required by the GPL in this and the other files of this package. If you do
not delete the provisions above, a recipient may use your version of this
file under either the BSD or the GPL.
------
** listen; version 3.2.1 --
@@ -4997,47 +5098,6 @@ TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
------
** bcrypt; version 3.1.13 --
Copyright 2007-2011
Copyright (c) 1998-2010 Solar Designer
Copyright (c) 1998-2014 Solar Designer
Copyright (c) 2000-2002 Solar Designer
Copyright (c) 2000-2011 Solar Designer
Copyright (c) 2000-2014 Solar Designer
Copyright (c) 2006 Damien Miller <djm@mindrot.org>
(The MIT License)
Copyright 2007-2011:
* Coda Hale <coda.hale@gmail.com>
C implementation of the BCrypt algorithm by Solar Designer and placed in the
public domain.
jBCrypt is Copyright (c) 2006 Damien Miller <djm@mindrot.org>.
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
'Software'), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
------
** diff-lcs; version 1.3 --


@@ -11,14 +11,15 @@ module Nonprofits
# post /nonprofits/:nonprofit_id/imports
def create
render_json do
ImportCreationJob.perform_later(import_params, current_user)
request = ImportRequest.create(import_params)
ImportCreationJob.perform_later(request, current_user)
end
end
private
def import_params
params.permit(:nonprofit_id, :file_uri, :header_matches)
params.permit(:nonprofit_id, :import_file, header_matches: {})
end
end
end


@@ -0,0 +1,16 @@
// License: LGPL-3.0-or-later
import { DirectUpload, Blob } from '@rails/activestorage';
export function uploadFile(controllerUrl: string, file: File): Promise<Blob> {
const duPromise = new Promise<Blob>((resolve, reject) => {
// eslint-disable-next-line @typescript-eslint/no-empty-function
const du = new DirectUpload(file, controllerUrl);
du.create((error, result) => {
if (error) { reject(error); }
if (result) { resolve(result); }
});
});
return duPromise;
}


@@ -1,41 +1,20 @@
// License: LGPL-3.0-or-later
const flyd = require('flyd')
const R = require('ramda')
// local
const request = require('./super-agent-frp')
const postFormData = require('./post-form-data')
const activestorage = require('../../common/activestorage')
// Pass in a stream of Input Nodes with type file
// Make a post request to our server to start the import
// Will create a backgrounded job and email the user when
// completed
// Returns a stream of {uri: 'uri of uploaded file on s3', formData: 'original form data'}
const uploadFile = R.curry(input => {
// We need to get an AWS presigned post thing to so we can upload files
// Stream of pairs of [formObjData, presignedPostObj]
var withPresignedPost$ = flyd.map(
resp => [input, resp.body]
, request.post('/aws_presigned_posts').perform()
)
// Stream of upload responses from s3
return flyd.flatMap(
pair => {
var [input, presignedPost] = pair
var url = `https://${presignedPost.s3_direct_url.host}`
var file = input.files[0]
var fileUrl = `${url}/tmp/${presignedPost.s3_uuid}/${file.name}`
var urlWithPort = `${url}:${presignedPost.s3_direct_url.port}`
var payload = R.merge(JSON.parse(presignedPost.s3_presigned_post), {file})
return flyd.map(resp => ({uri: fileUrl, file}), postFormData(url, payload))
}
, withPresignedPost$)
})
const uploadFile = (controllerUrl) => {
return R.curry(input => {
const $stream = flyd.stream()
activestorage.uploadFile(controllerUrl, input.files[0]).then((blob) => $stream(blob))
return $stream;
})
}
module.exports = uploadFile


@@ -49,7 +49,7 @@ function init() {
state.matchedHeaders$ = flyd.map(findHeaderMatches, headers$)
// state.submitImport$ is passed the current component state, and we just want a stream of input node objects for uploadFile
const uploaded$ = flyd.flatMap(uploadFile, state.submitImport$)
const uploaded$ = flyd.flatMap(uploadFile('/rails/active_storage/direct_uploads'), state.submitImport$)
// The matched headers with a simplified data structure to post to the server
// data structure is like {header_name => match_name} -- eg {'Donation Amount' => 'donation.amount'}
@@ -98,11 +98,11 @@ function init() {
// post to /imports after the file is uploaded to S3
const postImport = R.curry((headers, file) => {
const postImport = R.curry((headers, blob) => {
return flyd.map(R.prop('body'), request({
method: 'post'
, path: `/nonprofits/${app.nonprofit_id}/imports`
, send: {file_uri: file.uri, header_matches: headers}
, send: {import_file: blob.signed_id, header_matches: headers}
}).load)
})
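The comment above describes the header-matches structure as `{header_name => match_name}`. As an illustrative sketch only (expressed here as a Ruby hash; `'signed-blob-id'` is a placeholder, not a real ActiveStorage signed id), this is the shape of the body that `postImport` now sends, together with a sanity check on the match names the server understands:

```ruby
# Hypothetical request payload for POST /nonprofits/:id/imports after this
# change: a signed blob id plus the column-to-field mapping.
payload = {
  import_file: 'signed-blob-id', # placeholder for blob.signed_id
  header_matches: {
    'Donation Amount' => 'donation.amount',
    'First Name' => 'supporter.first_name',
    'Field Guy' => 'custom_field',
    'Tag 1' => 'tag'
  }
}

# Each match value is either the literal 'custom_field', the literal 'tag',
# or a 'table.column' pair consumed by the server-side import.
valid = payload[:header_matches].values.all? do |m|
  %w[custom_field tag].include?(m) || m.match?(/\A\w+\.\w+\z/)
end
```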


@@ -1,13 +1,11 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
class ImportCreationJob < ApplicationJob
queue_as :default
def perform(import_params, current_user)
InsertImport.from_csv_safe(
nonprofit_id: import_params[:nonprofit_id],
user_id: current_user.id,
user_email: current_user.email,
file_uri: import_params[:file_uri],
header_matches: import_params[:header_matches]
)
def perform(import_request, current_user)
import_request.execute_safe(current_user)
end
end


@@ -0,0 +1,150 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
class ImportRequest < ApplicationRecord
belongs_to :nonprofit
has_one_attached :import_file
def execute_safe(user)
begin
ImportRequest.transaction do
execute(user)
end
rescue Exception => e
body = "Import failed. Error: #{e}"
GenericMailer.generic_mail(
Houdini.support_email, Houdini.support_email, # FROM
body,
'Import error', # SUBJECT
Houdini.support_email, Houdini.support_email # TO
).deliver
end
end
def execute(user)
import = Import.create(date: Time.current, nonprofit: nonprofit, user: user)
row_count = 0
imported_count = 0
supporter_ids = []
created_payment_ids = []
import_file_blob.open do |file|
CSV.new(file, headers: :first_row).each do |row|
row_count += 1
# triplet of [header_name, value, import_key]
matches = row.map { |key, val| [key, val, header_matches[key]] }
next if matches.empty?
table_data = matches.each_with_object({}) do |triplet, acc|
key, val, match = triplet
if match == 'custom_field'
acc['custom_fields'] ||= []
acc['custom_fields'].push([key, val])
elsif match == 'tag'
acc['tags'] ||= []
acc['tags'].push(val)
else
table, col = match.split('.') if match.present?
if table.present? && col.present?
acc[table] ||= {}
acc[table][col] = val
end
end
end
# Create supporter record
if table_data['supporter']
table_data['supporter'] = InsertSupporter.defaults(table_data['supporter'])
table_data['supporter']['imported_at'] = Time.current
table_data['supporter']['import_id'] = import['id']
table_data['supporter']['nonprofit_id'] = nonprofit.id
table_data['supporter'] = Qx.insert_into(:supporters).values(table_data['supporter']).ts.returning('*').execute.first
supporter_ids.push(table_data['supporter']['id'])
imported_count += 1
else
table_data['supporter'] = {}
end
# Create custom fields
if table_data['supporter']['id'] && table_data['custom_fields'] && table_data['custom_fields'].any?
InsertCustomFieldJoins.find_or_create(nonprofit.id, [table_data['supporter']['id']], table_data['custom_fields'])
end
# Create new tags
if table_data['supporter']['id'] && table_data['tags'] && table_data['tags'].any?
# Split tags by semicolons or commas
tags = table_data['tags'].select(&:present?).map { |t| t.split(/[;,]/).map(&:strip) }.flatten
InsertTagJoins.find_or_create(nonprofit.id, [table_data['supporter']['id']], tags)
end
# Create donation record
if table_data['donation'] && table_data['donation']['amount'] # must have amount. donation.date without donation.amount is no good
table_data['donation']['amount'] = (table_data['donation']['amount'].gsub(/[^\d\.]/, '').to_f * 100).to_i
table_data['donation']['supporter_id'] = table_data['supporter']['id']
table_data['donation']['nonprofit_id'] = nonprofit.id
table_data['donation']['date'] = Chronic.parse(table_data['donation']['date']) if table_data['donation']['date'].present?
table_data['donation']['date'] ||= Time.current
table_data['donation'] = Qx.insert_into(:donations).values(table_data['donation']).ts.returning('*').execute.first
imported_count += 1
else
table_data['donation'] = {}
end
# Create payment record
if table_data['donation'] && table_data['donation']['id']
table_data['payment'] = Qx.insert_into(:payments).values(
gross_amount: table_data['donation']['amount'],
fee_total: 0,
net_amount: table_data['donation']['amount'],
kind: 'OffsitePayment',
nonprofit_id: nonprofit.id,
supporter_id: table_data['supporter']['id'],
donation_id: table_data['donation']['id'],
towards: table_data['donation']['designation'],
date: table_data['donation']['date']
).ts.returning('*')
.execute.first
imported_count += 1
else
table_data['payment'] = {}
end
# Create offsite payment record
if table_data['donation'] && table_data['donation']['id']
table_data['offsite_payment'] = Qx.insert_into(:offsite_payments).values(
gross_amount: table_data['donation']['amount'],
check_number: GetData.chain(table_data['offsite_payment'], 'check_number'),
kind: table_data['offsite_payment'] && table_data['offsite_payment']['check_number'] ? 'check' : '',
nonprofit_id: nonprofit.id,
supporter_id: table_data['supporter']['id'],
donation_id: table_data['donation']['id'],
payment_id: table_data['payment']['id'],
date: table_data['donation']['date']
).ts.returning('*')
.execute.first
imported_count += 1
else
table_data['offsite_payment'] = {}
end
created_payment_ids.push(table_data['payment']['id']) if table_data['payment'] && table_data['payment']['id']
end
end
# Create donation activity records
InsertActivities.for_offsite_donations(created_payment_ids) if created_payment_ids.count > 0
import.row_count = row_count
import.imported_count = imported_count
import.save!
Supporter.where("supporters.id IN (?)", supporter_ids).each do |s|
Houdini.event_publisher.announce(:supporter_create, s)
end
ImportCompletedJob.perform_later(import)
destroy
import
end
end
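The row-mapping core of `execute` is tangled up with framework calls (Qx inserts, ActiveStorage). A minimal framework-free sketch of that transformation follows; the helper names (`build_table_data`, `normalize_amount_cents`, `split_tags`) are illustrative, not part of the actual codebase:

```ruby
require 'csv'

# Group a CSV row into per-table hashes using header_matches values of the
# form 'table.column', plus the special matches 'custom_field' and 'tag'.
def build_table_data(row, header_matches)
  row.each_with_object({}) do |(key, val), acc|
    case header_matches[key]
    when 'custom_field'
      (acc['custom_fields'] ||= []) << [key, val]
    when 'tag'
      (acc['tags'] ||= []) << val
    when /\A\w+\.\w+\z/
      table, col = header_matches[key].split('.')
      (acc[table] ||= {})[col] = val
    end
  end
end

# Dollar string to integer cents, mirroring the gsub/to_f chain in the model
# (note: to_f * 100 can truncate a cent for some inputs due to float rounding).
def normalize_amount_cents(amount)
  (amount.gsub(/[^\d\.]/, '').to_f * 100).to_i
end

# Tags may arrive packed into one cell, separated by semicolons or commas.
def split_tags(tags)
  tags.reject { |t| t.nil? || t.empty? }
      .flat_map { |t| t.split(/[;,]/).map(&:strip) }
end

csv = CSV.parse("Amount,First Name,Tag 1\n$10.00,Robert,gala; vip\n",
                headers: :first_row)
matches = { 'Amount' => 'donation.amount',
            'First Name' => 'supporter.first_name',
            'Tag 1' => 'tag' }
data = build_table_data(csv.first, matches)
split_tags(data['tags'])                           # => ["gala", "vip"]
normalize_amount_cents(data['donation']['amount']) # => 1000
```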


@@ -0,0 +1,11 @@
class CreateImportRequests < ActiveRecord::Migration[6.0]
def change
create_table :import_requests do |t|
t.jsonb :header_matches
t.references :nonprofit
t.string :user_email
t.timestamps
end
end
end


@@ -1306,6 +1306,39 @@ CREATE SEQUENCE public.image_attachments_id_seq
ALTER SEQUENCE public.image_attachments_id_seq OWNED BY public.image_attachments.id;
--
-- Name: import_requests; Type: TABLE; Schema: public; Owner: -
--
CREATE TABLE public.import_requests (
id bigint NOT NULL,
header_matches jsonb,
nonprofit_id bigint,
user_email character varying,
created_at timestamp(6) without time zone NOT NULL,
updated_at timestamp(6) without time zone NOT NULL
);
--
-- Name: import_requests_id_seq; Type: SEQUENCE; Schema: public; Owner: -
--
CREATE SEQUENCE public.import_requests_id_seq
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
--
-- Name: import_requests_id_seq; Type: SEQUENCE OWNED BY; Schema: public; Owner: -
--
ALTER SEQUENCE public.import_requests_id_seq OWNED BY public.import_requests.id;
--
-- Name: imports; Type: TABLE; Schema: public; Owner: -
--
@@ -2528,6 +2561,13 @@ ALTER TABLE ONLY public.full_contact_topics ALTER COLUMN id SET DEFAULT nextval(
ALTER TABLE ONLY public.image_attachments ALTER COLUMN id SET DEFAULT nextval('public.image_attachments_id_seq'::regclass);
--
-- Name: import_requests id; Type: DEFAULT; Schema: public; Owner: -
--
ALTER TABLE ONLY public.import_requests ALTER COLUMN id SET DEFAULT nextval('public.import_requests_id_seq'::regclass);
--
-- Name: imports id; Type: DEFAULT; Schema: public; Owner: -
--
@@ -2944,6 +2984,14 @@ ALTER TABLE ONLY public.image_attachments
ADD CONSTRAINT image_attachments_pkey PRIMARY KEY (id);
--
-- Name: import_requests import_requests_pkey; Type: CONSTRAINT; Schema: public; Owner: -
--
ALTER TABLE ONLY public.import_requests
ADD CONSTRAINT import_requests_pkey PRIMARY KEY (id);
--
-- Name: imports imports_pkey; Type: CONSTRAINT; Schema: public; Owner: -
--
@@ -3291,6 +3339,13 @@ CREATE INDEX index_exports_on_nonprofit_id ON public.exports USING btree (nonpro
CREATE INDEX index_exports_on_user_id ON public.exports USING btree (user_id);
--
-- Name: index_import_requests_on_nonprofit_id; Type: INDEX; Schema: public; Owner: -
--
CREATE INDEX index_import_requests_on_nonprofit_id ON public.import_requests USING btree (nonprofit_id);
--
-- Name: index_payments_on_created_at; Type: INDEX; Schema: public; Owner: -
--
@@ -4035,6 +4090,7 @@ INSERT INTO "schema_migrations" (version) VALUES
('20181129205652'),
('20181129224030'),
('20191105200033'),
('20200423222447');
('20200423222447'),
('20200901214156');


@@ -1,172 +0,0 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
require 'qx'
require 'required_keys'
require 'open-uri'
require 'csv'
require 'insert/insert_supporter'
require 'insert/insert_custom_field_joins'
require 'insert/insert_tag_joins'
module InsertImport
# Wrap the import in a transaction and email any errors
def self.from_csv_safe(data)
Qx.transaction do
InsertImport.from_csv(data)
end
rescue Exception => e
body = "Import failed. Error: #{e}"
GenericMailer.generic_mail(
'support@commitchange.com', 'Jay Bot', # FROM
body,
'Import error', # SUBJECT
'support@commitchange.com', 'Jay' # TO
).deliver
end
# Insert a bunch of Supporter and related data using a CSV and a bunch of header_matches
# See also supporters/import/index.es6 for the front-end piece that generates header_matches
# This is a slow function; it is to be delayed-jobbed
# data: nonprofit_id, user_email, user_id, file, header_matches
# Will send a notification email to user_email when the import is completed
def self.from_csv(data)
ParamValidation.new(data,
file_uri: { required: true },
header_matches: { required: true },
nonprofit_id: { required: true, is_integer: true },
user_email: { required: true })
import = Qx.insert_into(:imports)
.values(
date: Time.current,
nonprofit_id: data[:nonprofit_id],
user_id: data[:user_id]
)
.timestamps
.returning('*')
.execute.first
row_count = 0
imported_count = 0
supporter_ids = []
created_payment_ids = []
# no spaces are allowed by open(). We could URI.encode, but spaces seem to be the only problem and we want to avoid double-encoding a URL
data[:file_uri] = data[:file_uri].gsub(/ /, '%20')
CSV.new(open(data[:file_uri]), headers: :first_row).each do |row|
row_count += 1
# triplet of [header_name, value, import_key]
matches = row.map { |key, val| [key, val, data[:header_matches][key]] }
next if matches.empty?
table_data = matches.each_with_object({}) do |triplet, acc|
key, val, match = triplet
if match == 'custom_field'
acc['custom_fields'] ||= []
acc['custom_fields'].push([key, val])
elsif match == 'tag'
acc['tags'] ||= []
acc['tags'].push(val)
else
table, col = match.split('.') if match.present?
if table.present? && col.present?
acc[table] ||= {}
acc[table][col] = val
end
end
end
# Create supporter record
if table_data['supporter']
table_data['supporter'] = InsertSupporter.defaults(table_data['supporter'])
table_data['supporter']['imported_at'] = Time.current
table_data['supporter']['import_id'] = import['id']
table_data['supporter']['nonprofit_id'] = data[:nonprofit_id]
table_data['supporter'] = Qx.insert_into(:supporters).values(table_data['supporter']).ts.returning('*').execute.first
supporter_ids.push(table_data['supporter']['id'])
imported_count += 1
else
table_data['supporter'] = {}
end
# Create custom fields
if table_data['supporter']['id'] && table_data['custom_fields'] && table_data['custom_fields'].any?
InsertCustomFieldJoins.find_or_create(data[:nonprofit_id], [table_data['supporter']['id']], table_data['custom_fields'])
end
# Create new tags
if table_data['supporter']['id'] && table_data['tags'] && table_data['tags'].any?
# Split tags by semicolons
tags = table_data['tags'].select(&:present?).map { |t| t.split(/[;,]/).map(&:strip) }.flatten
InsertTagJoins.find_or_create(data[:nonprofit_id], [table_data['supporter']['id']], tags)
end
# Create donation record
if table_data['donation'] && table_data['donation']['amount'] # must have amount. donation.date without donation.amount is no good
table_data['donation']['amount'] = (table_data['donation']['amount'].gsub(/[^\d\.]/, '').to_f * 100).to_i
table_data['donation']['supporter_id'] = table_data['supporter']['id']
table_data['donation']['nonprofit_id'] = data[:nonprofit_id]
table_data['donation']['date'] = Chronic.parse(table_data['donation']['date']) if table_data['donation']['date'].present?
table_data['donation']['date'] ||= Time.current
table_data['donation'] = Qx.insert_into(:donations).values(table_data['donation']).ts.returning('*').execute.first
imported_count += 1
else
table_data['donation'] = {}
end
# Create payment record
if table_data['donation'] && table_data['donation']['id']
table_data['payment'] = Qx.insert_into(:payments).values(
gross_amount: table_data['donation']['amount'],
fee_total: 0,
net_amount: table_data['donation']['amount'],
kind: 'OffsitePayment',
nonprofit_id: data[:nonprofit_id],
supporter_id: table_data['supporter']['id'],
donation_id: table_data['donation']['id'],
towards: table_data['donation']['designation'],
date: table_data['donation']['date']
).ts.returning('*')
.execute.first
imported_count += 1
else
table_data['payment'] = {}
end
# Create offsite payment record
if table_data['donation'] && table_data['donation']['id']
table_data['offsite_payment'] = Qx.insert_into(:offsite_payments).values(
gross_amount: table_data['donation']['amount'],
check_number: GetData.chain(table_data['offsite_payment'], 'check_number'),
kind: table_data['offsite_payment'] && table_data['offsite_payment']['check_number'] ? 'check' : '',
nonprofit_id: data[:nonprofit_id],
supporter_id: table_data['supporter']['id'],
donation_id: table_data['donation']['id'],
payment_id: table_data['payment']['id'],
date: table_data['donation']['date']
).ts.returning('*')
.execute.first
imported_count += 1
else
table_data['offsite_payment'] = {}
end
created_payment_ids.push(table_data['payment']['id']) if table_data['payment'] && table_data['payment']['id']
end
# Create donation activity records
InsertActivities.for_offsite_donations(created_payment_ids) if created_payment_ids.count > 0
import = Qx.update(:imports)
.set(row_count: row_count, imported_count: imported_count)
.where(id: import['id'])
.returning('*')
.execute.first
Supporter.where("supporter.ids IN (?)", supporter_ids).each do |s|
Houdini.event_publisher.announce(:supporter_create, s)
end
ImportCompletedJob.perform_later(Import.find(import['id']))
import
end
end


@@ -0,0 +1,11 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
FactoryBot.define do
factory :import_request do
header_matches { "" }
nonprofit { "" }
user_email { "MyString" }
end
end

View file

@@ -1,110 +0,0 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
require 'rails_helper'
describe InsertImport, pending: true do
before(:all) do
# @data = PsqlFixtures.init
end
describe '.from_csv' do
before(:all) do
# @row_count = 4
# @args = {
# nonprofit_id: @data['np']['id'],
# user_email: @data['np_admin']['email'],
# user_id: @data['np_admin']['id'],
# file_uri: "#{ENV['PWD']}/spec/fixtures/test_import.csv",
# header_matches: {
# "Date" => "donation.date",
# "Program" => "donation.designation",
# "Amount" => "donation.amount",
# "Business or organization name" => "supporter.organization",
# "First Name" => "supporter.first_name",
# "Last Name" => "supporter.last_name",
# "Address" => "supporter.address",
# "City" => "supporter.city",
# "State" => "supporter.state_code",
# "Zip Code" => "supporter.zip_code",
# "EMAIL" => "supporter.email",
# "notes" => "donation.comment",
# "Field Guy" => "custom_field",
# "Tag 1" => "tag",
# "Tag 2" => "tag"
# }
# }
# @result = InsertImport.from_csv(@args)
# @supporters = Psql.execute("SELECT * FROM supporters WHERE import_id = #{@result['id']}")
# @supporter_ids = @supporters.map{|h| h['id']}
# @donations = Psql.execute("SELECT * FROM donations WHERE supporter_id IN (#{@supporter_ids.join(",")})")
end
it 'creates an import table with all the correct data' do
expect(@result['nonprofit_id']).to eq(@data['np']['id'])
expect(@result['id']).to be_present
expect(@result['row_count']).to eq @row_count
expect(@result['date']).to eq(@result['created_at'])
expect(@result['user_id']).to eq(@data['np_admin']['id'])
expect(@result['imported_count']).to eq(16)
end
it 'creates all the supporters with correct names' do
names = @supporters.map { |s| s['name'] }
expect(names.sort).to eq(Hamster::Vector['Robert Norris', 'Angie Vaughn', 'Bill Waddell', 'Bubba Thurmond'].sort)
end
it 'creates all the supporters with correct emails' do
emails = @supporters.map { |s| s['email'] }
expect(emails.sort).to eq(Hamster::Vector['user@example.com', 'user@example.com', 'user@example.com', 'user@example.com'].sort)
end
it 'creates all the supporters with correct organizations' do
orgs = @supporters.map { |s| s['organization'] }
expect(orgs.sort).to eq(Hamster::Vector['Jet-Pep', 'Klein Drug Shoppe, Inc.', 'River City Equipment Rental and Sales', 'Somewhere LLC'].sort)
end
it 'creates all the supporters with correct cities' do
cities = @supporters.map { |s| s['city'] }
expect(cities.sort).to eq(Hamster::Vector['Decatur', 'Guntersville', 'Holly Pond', 'Snead'].sort)
end
it 'creates all the supporters with correct addresses' do
addresses = @supporters.map { |s| s['address'] }
expect(addresses.sort).to eq(Hamster::Vector['3370 Alabama Highway 69', '649 Finley Island Road', 'P.O. Box 143', 'P.O. Box 611'].sort)
end
it 'creates all the supporters with correct zip_codes' do
zips = @supporters.map { |s| s['zip_code'] }
expect(zips.sort).to eq(Hamster::Vector['35601', '35806', '35952', '35976'].sort)
end
it 'creates all the supporters with correct state_codes' do
states = @supporters.map { |s| s['state_code'] }
expect(states.sort).to eq(Hamster::Vector['AL', 'AL', 'AL', 'AL'])
end
it 'creates all the donations with correct amounts' do
amounts = @donations.map { |d| d['amount'] }
expect(amounts.sort).to eq(Hamster::Vector[1000, 1000, 1000, 1000])
end
it 'creates all the donations with correct designations' do
desigs = @donations.map { |d| d['designation'] }
expect(desigs.sort).to eq(Hamster::Vector['third party event', 'third party event', 'third party event', 'third party event'])
end
it 'inserts custom fields' do
vals = Psql.execute('SELECT value FROM custom_field_joins ORDER BY id DESC LIMIT 4').map { |h| h['value'] }
expect(vals).to eq(Hamster::Vector['custfield', 'custfield', 'custfield', 'custfield'])
end
it 'inserts tags' do
ids = @supporters.map { |h| h['id'] }.join(', ')
names = Psql.execute("SELECT tag_masters.name FROM tag_joins JOIN tag_masters ON tag_masters.id=tag_joins.tag_master_id WHERE tag_joins.supporter_id IN (#{ids})")
.map { |h| h['name'] }
expect(Hamster.to_ruby(names).sort).to eq(%w[tag1 tag1 tag1 tag1 tag2 tag2 tag2 tag2])
end
end
end


@@ -0,0 +1,117 @@
# frozen_string_literal: true
# License: AGPL-3.0-or-later WITH WTO-AP-3.0-or-later
# Full license explanation at https://github.com/houdiniproject/houdini/blob/master/LICENSE
require 'rails_helper'
RSpec.describe ImportRequest, type: :model do
let(:import_path) { 'spec/fixtures/test_import.csv' }
let(:import_filename) { 'test_import.csv' }
let(:row_count) { 4 }
let(:nonprofit) { create(:nm_justice) }
let(:user) { force_create(:user) }
let(:user_email) { user.email }
let(:header_matches) { {
"Date" => "donation.date",
"Program" => "donation.designation",
"Amount" => "donation.amount",
"Business or organization name" => "supporter.organization",
"First Name" => "supporter.first_name",
"Last Name" => "supporter.last_name",
"Address" => "supporter.address",
"City" => "supporter.city",
"State" => "supporter.state_code",
"Zip Code" => "supporter.zip_code",
"EMAIL" => "supporter.email",
"notes" => "donation.comment",
"Field Guy" => "custom_field",
"Tag 1" => "tag",
"Tag 2" => "tag"
}}
describe 'successful' do
around(:each) do |example|
Timecop.freeze(2020, 5, 5) do
example.run
end
end
let(:request) {
ir = ImportRequest.new(nonprofit: nonprofit, header_matches: header_matches, user_email: user_email)
ir.import_file.attach(io: File.open(import_path), filename: import_filename)
ir.save!
ir
}
let!(:import) { request.execute(user)}
let(:donations) { Supporter.all.map{|i| i.donations}.flatten}
it 'creates an Import with all the correct data' do
expect(import.nonprofit).to eq(nonprofit)
expect(import.id).to be_present
expect(import.row_count).to eq row_count
expect(import.date).to eq(import.created_at)
expect(import.user_id).to eq(user.id)
expect(import.imported_count).to eq(16)
end
it 'deletes the import request' do
expect(ImportRequest.where(id: request.id).count).to eq 0
end
it 'creates all the supporters with correct names' do
names = Supporter.pluck(:name)
expect(names).to match_array ['Robert Norris', 'Angie Vaughn', 'Bill Waddell', 'Bubba Thurmond']
end
it 'creates all the supporters with correct emails' do
emails = Supporter.pluck(:email)
expect(emails).to match_array(['user@example.com', 'user@example.com', 'user@example.com', 'user@example.com'])
end
it 'creates all the supporters with correct organizations' do
orgs = Supporter.pluck(:organization)
expect(orgs).to match_array ['Jet-Pep', 'Klein Drug Shoppe, Inc.', 'River City Equipment Rental and Sales', 'Somewhere LLC']
end
it 'creates all the supporters with correct cities' do
cities = Supporter.pluck(:city)
expect(cities).to match_array ['Decatur', 'Guntersville', 'Holly Pond', 'Snead']
end
it 'creates all the supporters with correct addresses' do
addresses = Supporter.pluck(:address)
expect(addresses).to match_array(['3370 Alabama Highway 69', '649 Finley Island Road', 'P.O. Box 143', 'P.O. Box 611'])
end
it 'creates all the supporters with correct zip_codes' do
zips = Supporter.pluck(:zip_code)
expect(zips).to match_array(['35601', '35806', '35952', '35976'])
end
it 'creates all the supporters with correct state_codes' do
states = Supporter.pluck(:state_code)
expect(states).to match_array(['AL', 'AL', 'AL', 'AL'])
end
it 'creates all the donations with correct amounts' do
amounts = donations.map { |d| d['amount'] }
expect(amounts).to match_array([1000, 1000, 1000, 1000])
end
it 'creates all the donations with correct designations' do
desigs = donations.map { |d| d['designation'] }
expect(desigs).to match_array(['third party event', 'third party event', 'third party event', 'third party event'])
end
it 'inserts custom fields' do
vals = CustomFieldJoin.pluck(:value)
expect(vals).to match_array(['custfield', 'custfield', 'custfield', 'custfield'])
end
it 'inserts tags' do
names = TagJoin.joins(:tag_master).pluck("tag_masters.name")
expect(names).to match_array(%w[tag1 tag1 tag1 tag1 tag2 tag2 tag2 tag2])
end
end
end